Preston

Reputation: 8177

Minimal Filebeat => Logstash => Elasticsearch setup not receiving logs in Elasticsearch (Docker)

TLDR

I'm trying to use a dockerized Elastic stack to parse two log files. The stack picks up logs added to files in /usr/share/filebeat/scrape_logs and stores them in Elasticsearch via Logstash.

I can see the logs arriving in Logstash (they are displayed as below), but when I run the query GET /_cat/indices/ from Kibana, no index is present.

I've created a GitHub repo here with the relevant setup. If you'd like to run the code, simply run docker-compose up, then echo '2021-03-15 09:58:59,255 [INFO] - i am a test' >> beat_test/log1.log to append additional logs.

Why do I not see an index created in Elasticsearch, and why are the logs not indexed?

Detail

logstash         | {
logstash         |           "host" => {
logstash         |         "name" => "b5bd03c1654c"
logstash         |     },
logstash         |     "@timestamp" => 2021-03-15T22:09:06.220Z,
logstash         |            "log" => {
logstash         |           "file" => {
logstash         |             "path" => "/usr/share/filebeat/scrape_logs/log1.log"
logstash         |         },
logstash         |         "offset" => 98
logstash         |     },
logstash         |          "input" => {
logstash         |         "type" => "log"
logstash         |     },
logstash         |           "tags" => [
logstash         |         [0] "beats_input_codec_plain_applied"
logstash         |     ],
logstash         |            "ecs" => {
logstash         |         "version" => "1.6.0"
logstash         |     },
logstash         |       "@version" => "1",
logstash         |          "agent" => {
logstash         |                 "name" => "b5bd03c1654c",
logstash         |                 "type" => "filebeat",
logstash         |         "ephemeral_id" => "e171b269-2364-47ff-bc87-3fe0bd73bf8c",
logstash         |              "version" => "7.11.2",
logstash         |             "hostname" => "b5bd03c1654c",
logstash         |                   "id" => "97aaac06-c87f-446f-aadc-8187b155e9e9"
logstash         |     },
logstash         |        "message" => "2021-03-15 09:58:59,255 [INFO] - i am a test"
logstash         | }

docker-compose.yml

version: '3.6'
services:
  elasticsearch:
    image: elasticsearch:7.11.1
    container_name: elasticsearch
    environment:
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms2g -Xmx2g"
      - discovery.type=single-node
    ports: ['9200:9200']
    volumes:
      - ./es_data:/usr/share/elasticsearch/data

  kibana:
    image: kibana:7.11.1
    container_name: kibana
    ports: ['5601:5601']
    depends_on: ['elasticsearch']

  logstash:
    image: logstash:7.11.1
    container_name: logstash
    volumes:
      - ./scrape_logs.conf:/usr/share/logstash/config/scrape_logs.conf
    depends_on: ['elasticsearch']

  filebeat:
    image: docker.elastic.co/beats/filebeat:7.11.2
    container_name: filebeat
    user: root
    command: --strict.perms=false -e
    volumes:
      - ./filebeat.yml:/usr/share/filebeat/filebeat.yml
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - /var/lib/docker/containers:/var/lib/docker/containers:ro
      - ./beat_test:/usr/share/filebeat/scrape_logs
    depends_on: ['elasticsearch', 'kibana']

volumes:
  es_data:

scrape_logs.conf

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    host => "elasticsearch:9200"
    index => "scrape_test"
  }
}

Upvotes: 0

Views: 707

Answers (1)

Val

Reputation: 217274

The issue is that you need to mount the Logstash pipeline configuration into the /usr/share/logstash/pipeline folder; the /usr/share/logstash/config folder is only for settings files.

If you don't, Logstash falls back to a default /usr/share/logstash/pipeline/logstash.conf pipeline that does basically the following, which is why you're seeing the events in the Logstash console log:

input {
  beats {
    port => 5044
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

So you need to replace the default pipeline by changing the Logstash service in your docker-compose.yml to this:

  logstash:
    image: logstash:7.11.1
    container_name: logstash
    volumes:
      - ./pipeline:/usr/share/logstash/pipeline
    depends_on: ['elasticsearch']

You also need to create a folder called pipeline and move the scrape_logs.conf file into it.
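On the host, that step is just the following (a sketch run from the compose project root; the file and folder names match the question's repo layout):

```shell
# Create the pipeline folder that docker-compose mounts into the container,
# then move the existing pipeline file into it.
mkdir -p pipeline
mv scrape_logs.conf pipeline/scrape_logs.conf
```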

Finally, there is a typo in the scrape_logs.conf file: the host setting in the elasticsearch output should be called hosts:

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "scrape_test"
  }
}

Once all that is done, restart your Docker stack, go into Kibana, and you'll see your logs.
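If you prefer to verify from the host rather than Kibana, something like this should work (a sketch assuming the 9200:9200 port mapping and the scrape_test index name from the question, and that docker-compose and curl are available):

```shell
# Recreate the stack so Logstash loads the pipeline from the new mount
docker-compose down
docker-compose up -d

# Append a test line in the format Filebeat is watching for
echo '2021-03-15 09:58:59,255 [INFO] - i am a test' >> beat_test/log1.log

# The scrape_test index should now be listed
curl -s 'http://localhost:9200/_cat/indices/scrape_test?v'
```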

Upvotes: 1
