k.explorer

Reputation: 321

Filebeat not sending Docker container logs to Elasticsearch

My requirement is to configure Filebeat to send logs to Elasticsearch; the source for Filebeat is Docker container logs.

I am using Docker to install Filebeat. Below are the Dockerfile, filebeat.yml, and docker-compose files I have used for the configuration.

Dockerfile:

FROM docker.elastic.co/beats/filebeat:7.2.1
# Copy our custom configuration file
COPY filebeat.yml /usr/share/filebeat/filebeat.yml
USER root
# Create a directory to map volume with all docker log files
RUN mkdir /usr/share/filebeat/dockerlogs
RUN chown -R root /usr/share/filebeat/
RUN chmod -R go-w /usr/share/filebeat/

filebeat.yml

#filebeat.modules:
#- module: system
# syslog:
#   enabled: true
  #auth:
    #enabled: true
    #- module: auditd
    #log:
    # Does not look like Auditd is supported in Alpine linux: https://github.com/linuxkit/linuxkit/issues/52
    #enabled: false

filebeat.inputs:
- type: docker
  enabled: true
  containers:
    path: "/var/lib/docker/containers"
    stream: all # can be all, stdout or stderr
    ids:
      - '*'
  # exclude_lines: ["^\\s+[\\-`('.|_]"]  # drop asciiart lines
  # multiline.pattern: "^\t|^[[:space:]]+(at|...)|^Caused by:"
  # multiline.match: after

#========================== Filebeat autodiscover ==============================
# See this URL on how to run the Apache2 Filebeat module: https://www.elastic.co/guide/en/beats/filebeat/current/running-on-docker.html
#filebeat.autodiscover:
# providers:
#   - type: docker
      # https://www.elastic.co/guide/en/beats/filebeat/current/configuration-autodiscover-hints.html
      # This URL also contains instructions on multi-line logs
      #     hints.enabled: true

#================================ Processors ===================================
processors:
#- add_cloud_metadata: ~
- add_docker_metadata: ~
- add_locale:
    format: offset
- add_host_metadata:
    netinfo.enabled: true

#========================== Elasticsearch output ===============================
output.elasticsearch:
  hosts: ["http://192.168.241.40:9200"]

docker-compose.yml

version: '2'
services:
 filebeat:
  hostname: filebeat
# ** To build the image, you need to specify your own Docker Hub account:
  image: filebeat/img-1

  volumes:
# needed to persist filebeat tracking data :
   - "filebeat_data:/usr/share/filebeat/data:rw"
# needed to access all docker logs (read only) :
   - "/var/lib/docker/containers:/usr/share/dockerlogs/data:ro"
# needed to access additional information about containers
   - "/var/run/docker.sock:/var/run/docker.sock"

volumes:
# create a persistent volume for Filebeat
 filebeat_data:

Using this configuration I am able to install Filebeat on my machine, but when I run a service on the host machine I am unable to capture the generated logs with Filebeat and send them to Elasticsearch.

What could I be doing wrong? Any help is appreciated.

Note: Elasticsearch (7.2.1) and Kibana (7.2.1) are already installed on the same machine, and I am able to reach them through 192.168.241.40:9200.

Upvotes: 4

Views: 4369

Answers (1)

YAO ALEX DIDIER AKOUA

Reputation: 249

Filebeat version: 7.12.0. You need to configure autodiscover when you consume Docker container logs.

filebeat.yml

# # =========================== Filebeat autodiscover ============================

filebeat.autodiscover:
  providers:
    - type: docker
      templates:
        - condition:
            contains:
              docker.container.image: <your_label_condition>
          config:
            - type: container
              paths:
                - "/var/lib/docker/containers/${data.docker.container.id}/*.log"
              exclude_lines: ["^\\s+[\\-`('.|_]"]

filebeat.shutdown_timeout: 5s   #optional

# ------------------------------- Console Output -------------------------------
output.console:
  enabled: true
  codec.json:
    pretty: true
    escape_html: false  

logging.metrics.enabled: false
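A quick way to see what that `exclude_lines` pattern actually drops (checked here with Python's `re`, whose behavior matches Filebeat's Go regexp for this pattern):

```python
import re

# Same regex as exclude_lines in the config above:
# leading whitespace followed by a typical ASCII-art character
pattern = re.compile(r"^\s+[\-`('.|_]")

print(bool(pattern.search("   --- banner ---")))       # True: line is dropped
print(bool(pattern.search("\t|___ ascii art")))        # True: line is dropped
print(bool(pattern.search("ERROR something failed")))  # False: line is kept
```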

I use the console output to verify that everything is OK before sending to Logstash.
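Once the console output looks correct, `output.console` can be swapped for a Logstash output. A minimal sketch; the hostname `logstash` is an assumption for your setup, and 5044 is the default port of the Logstash Beats input:

```yaml
# Replace output.console with this once events look correct.
# "logstash" and 5044 are assumptions; adjust to your environment.
output.logstash:
  hosts: ["logstash:5044"]
```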

Upvotes: 1
