helpper

Reputation: 2566

How to run multiple pipelines in Logstash using the Elastic Stack (ELK) powered by Docker and Compose

I am using this_repo to get started running ELK with Docker.

My question is regarding the Logstash image in the docker-compose file:

When I run Logstash locally I have 3 files:

# general settings
logstash.yml
# pipeline settings
pipeline.yml
# a pipeline configuration
myconf.conf1

When I want to use multiple pipelines I use the pipeline.yml file to control all the different pipelines I am running:

# Example of my pipeline.yml
- pipeline.id: my-first-pipeline
  path.config: "/etc/logstash/my-first-pipeline.config"
  pipeline.workers: 2
- pipeline.id: super-swell-pipeline
  path.config: "/etc/logstash/super-swell-pipeline.config"
  queue.type: persisted

In the repo I am using as a guideline I can only find logstash.yml, and I don't understand how I can add pipelines. The only running pipeline is the default "main", which by default only runs logstash.conf. I tried different configurations, but they all failed.
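As far as I can tell, the image ships with a default pipelines.yml along these lines (an approximation, the exact contents may vary by version):

# default pipelines.yml baked into the Logstash image (approximate)
- pipeline.id: main
  path.config: "/usr/share/logstash/pipeline"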

How can I add pipeline.yml to the Docker setup? Or what is the best practice for running multiple pipelines with this docker-compose file?

I appreciate any help.

The logstash service from the repo's docker-compose file:

  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./logstash/config/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true
      - type: bind
        #can be either a host path or volume name.
        source: ./logstash/pipeline
        #is the container path where the volume is mounted
        target: /usr/share/logstash/pipeline
        read_only: true
    ports:
      - "5000:5000/tcp"
      - "5000:5000/udp"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    depends_on:
      - elasticsearch

Dockerfile:

ARG ELK_VERSION

# https://www.docker.elastic.co/
FROM docker.elastic.co/logstash/logstash:${ELK_VERSION}

# Add your logstash plugins setup here
# Example: RUN logstash-plugin install logstash-filter-json

logstash.yml:

## Default Logstash configuration from Logstash base image.
## https://github.com/elastic/logstash/blob/master/docker/data/logstash/config/logstash-full.yml
#
http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]

## X-Pack security credentials
#
xpack.monitoring.enabled: false
xpack.monitoring.elasticsearch.username: elastic
xpack.monitoring.elasticsearch.password: changeme

Upvotes: 1

Views: 7842

Answers (1)

apt-get_install_skill

Reputation: 2908

You need to mount your pipelines.yml file into the container as well. The default location where Logstash looks for a pipelines.yml file is /usr/share/logstash/config/ (the same folder you have already mounted the logstash.yml file to).
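For example, here is a minimal sketch of the volumes section with the extra bind mount added, assuming you keep the file at ./logstash/config/pipelines.yml next to the existing logstash.yml (that host path is an assumption, adjust it to your layout):

  logstash:
    volumes:
      # existing mount for the general settings
      - type: bind
        source: ./logstash/config/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true
      # additional mount for the pipelines definition (assumed host path)
      - type: bind
        source: ./logstash/config/pipelines.yml
        target: /usr/share/logstash/config/pipelines.yml
        read_only: true
      # existing mount for the pipeline configuration files
      - type: bind
        source: ./logstash/pipeline
        target: /usr/share/logstash/pipeline
        read_only: true

After adding the mount, recreate the Logstash container (for example with docker-compose up -d --build logstash) so it picks up the new file.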

Please note that you also have to update your current, local pipelines.yml file to the correct paths of the pipelines inside the container. To be precise, you need to change

path.config: "/etc/logstash/my-first-pipeline.config"

to

path.config: "/usr/share/logstash/pipeline/my-first-pipeline.config"

Also, have a look at the official guides on running Logstash with Docker and on configuring multiple pipelines.

I hope I could help you!

EDIT:

The official documentation calls the file pipelines.yml, not pipeline.yml.

Upvotes: 9
