Lee Sai Mun

Reputation: 310

How to import CSV or JSON data into Elasticsearch using deviantony/docker-elk

I just started picking up Elasticsearch and Docker a few days ago, and I am having some trouble ingesting data into Elasticsearch. The Elastic stack repo that I am using is this: https://github.com/deviantony/docker-elk

I tried to follow this tutorial that I found online: https://www.bmc.com/blogs/elasticsearch-load-csv-logstash/, but I could not find any indexes when I loaded up Kibana.

Here is what I did. I downloaded some sample data and stored it inside a folder called data under the root directory of the repo. In the docker-compose.yml file, I added a bind mount that points to my external data folder.

elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./elasticsearch/config/elasticsearch.yml
        target: /usr/share/elasticsearch/config/elasticsearch.yml
        read_only: true
      - type: bind
        source: ./data
        target: /usr/share/elasticsearch/data
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      ELASTIC_PASSWORD: password
      # Use single node discovery in order to disable production mode and avoid bootstrap checks
      # see https://www.elastic.co/guide/en/elasticsearch/reference/current/bootstrap-checks.html
      discovery.type: single-node
    networks:
      - elk

And this is what I changed in my logstash.conf file:

input {
  tcp {
    port => 5000
  }
  file {
    path => "/usr/share/elasticsearch/data/conn250K.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => [ "record_id", "duration", "src_bytes", "dest_bytes" ]
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "password"
    index => "network"
  }
}

After running the "docker-compose up" command in the terminal, I could not find any index pattern to create in Kibana because no indexes had been generated. I can't figure out what is wrong.

Upvotes: 2

Views: 2236

Answers (1)

Trial

Reputation: 46

Try bind-mounting the external data folder to the logstash container instead of the elasticsearch container. Logstash's file input reads from the Logstash container's own filesystem, so the CSV has to be visible inside that container, and the path in logstash.conf has to point to wherever it is mounted there.

  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./logstash/config/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true
      - type: bind
        source: ./logstash/pipeline
        target: /usr/share/logstash/pipeline
        read_only: true
      - type: bind
        source: ./data
        target: /usr/share/logstash/data
        read_only: true
    ports:
      - "5000:5000/tcp"
      - "5000:5000/udp"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    depends_on:
      - elasticsearch
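
With the data folder mounted into the Logstash container, the path in the file input also needs to point inside that container rather than the Elasticsearch one. A minimal sketch of the adjusted input block, assuming the CSV sits directly in ./data and keeping the existing filter and output blocks unchanged:

input {
  tcp {
    port => 5000
  }
  file {
    path => "/usr/share/logstash/data/conn250K.csv"
    start_position => "beginning"
    # assumption: disable the sincedb bookmark so the file is re-read on every restart while testing
    sincedb_path => "/dev/null"
  }
}

Once Logstash has picked up the file, you can check whether the "network" index was created with something like:

curl -u elastic:password http://localhost:9200/_cat/indices?v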

Upvotes: 2
