user1652054

Reputation: 465

How to parse data from S3 using Logstash, push it to Elasticsearch, and then visualize it in Kibana

I have a log file created in an S3 bucket every minute. The data is "\x01"-delimited, and one of the columns is a timestamp field.

I want to load this data into Elasticsearch.

I tried using the following Logstash config, but it doesn't seem to work; I don't see any output. I took some reference from http://brewhouse.io/blog/2014/11/04/big-data-with-elk-stack.html

The Logstash config file is as follows:

input {
  s3 {
    bucket => "mybucketname"
    credentials => [ "accesskey", "secretkey" ]
  }
}
filter {
  csv {
    columns => [ "col1", "col2", "@timestamp" ]
    separator => "\x01"
  }
}
output {
  stdout { } 
}

How do I modify this config to pick up the new file that arrives every minute?
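
For reference, here is a minimal sketch of one way the config might be adjusted, assuming a logstash-input-s3 version that accepts separate credential options and exposes its polling interval (60 seconds by default); the region, sincedb path, replacement delimiter, and timestamp format below are all placeholder assumptions, not from the original question:

input {
  s3 {
    bucket => "mybucketname"
    access_key_id => "accesskey"        # newer plugin versions take separate
    secret_access_key => "secretkey"    # credential options instead of an array
    region => "us-east-1"               # hypothetical; use your bucket's region
    interval => 60                      # poll the bucket for new objects every 60 s (the default)
    sincedb_path => "/var/lib/logstash/sincedb_s3"  # hypothetical path; tracks already-processed objects
  }
}
filter {
  # Config strings are not guaranteed to unescape "\x01", but gsub patterns
  # are regular expressions, so rewrite the delimiter into one the csv filter
  # can be given literally (unsafe if the data itself contains commas).
  mutate {
    gsub => [ "message", "\x01", "," ]
  }
  csv {
    columns => [ "col1", "col2", "log_timestamp" ]
    separator => ","
  }
  # Map the parsed column onto @timestamp; the format here is a guess.
  date {
    match => [ "log_timestamp", "ISO8601" ]
  }
}
output {
  elasticsearch { host => "127.0.0.1" }
  stdout { codec => rubydebug }  # keep a console copy while debugging
}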

I would then eventually want to connect Kibana to ES to visualize the changes.

Upvotes: 4

Views: 9455

Answers (1)

Vova Lando

Reputation: 558

Just use logstash-forwarder to ship the files to Logstash; note that it tails local files, so the S3 objects would need to be synced to disk first. You will also have to generate certificates for authorization.

There is a really nice tutorial: https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-logs-on-centos-7
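
As a rough sketch of the two sides (the server address, port, paths, and type label here are hypothetical, not taken from the tutorial), logstash-forwarder reads a JSON config pointing at the files to ship and the Logstash endpoint:

{
  "network": {
    "servers": [ "logstash.example.com:5043" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "timeout": 15
  },
  "files": [
    {
      "paths": [ "/var/log/myapp/*.log" ],
      "fields": { "type": "s3log" }
    }
  ]
}

and Logstash receives the stream with a lumberjack input using the same certificate:

input {
  lumberjack {
    port => 5043
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}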

If you are getting I/O errors, you may be able to solve them by setting the cluster name.

Inside logstash.conf:

output {
    elasticsearch {
        host => "127.0.0.1"
        cluster => "CLUSTER_NAME"
    }
}

Inside elasticsearch.yml:

cluster.name: CLUSTER_NAME
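
To confirm the two names actually match, the Elasticsearch root endpoint reports the cluster name (assuming the default HTTP port of 9200):

curl -XGET 'http://127.0.0.1:9200/'
# the JSON response includes a "cluster_name" field, which should read "CLUSTER_NAME"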

If you are having problems generating certificates, you can generate them using this: https://raw.githubusercontent.com/driskell/log-courier/develop/src/lc-tlscert/lc-tlscert.go
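
Assuming a working Go toolchain, that utility is a single self-contained file, so something like the following should work; it walks you through the certificate details interactively:

wget https://raw.githubusercontent.com/driskell/log-courier/develop/src/lc-tlscert/lc-tlscert.go
go run lc-tlscert.go   # answer the prompts for host names/IPs and output paths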

I also found a better init.d script for logstash-forwarder on CentOS: http://smuth.me/posts/centos-6-logstash-forwarder-init-script.html

Upvotes: 1
