pyramid13

Reputation: 286

Logstash, Kafka, Graylog and the order of messages

We have an application with an event log. The application sends the event log to Kafka in JSON format. Our Kafka topic has a single partition because we need to read the messages in order. We also use Logstash to consume the event log, convert the JSON format to GELF, and send it to Graylog. The problem is that even though the topic has only one partition, the consumer (Logstash) does not read the messages in order, so our ordering is disturbed. We use

stdout { codec => rubydebug }

in the Logstash output config, and the Logstash log confirms the issue. Naturally, the events do not arrive in Graylog in order either. Why is the order messed up? Is the problem with Kafka, Logstash, Graylog, or something else?

Thanks.

UPDATE

Logstash config:

input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092,kafka3:9092"
    group_id => "MyTopicReader"
    topics => "MyTopic"
    consumer_threads => 1
    enable_metric => "false"
    auto_offset_reset => "latest"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  gelf {
    host => "ip-graylog"
    port => 12201
    sender => "ip-logstash"
  }
  stdout { codec => rubydebug }
}

Pipeline config (pipelines.yml):

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"

Pipeline settings in logstash.yml:

pipeline.id: main
pipeline.workers: 2
pipeline.batch.size: 125
pipeline.batch.delay: 50

Upvotes: 0

Views: 1179

Answers (1)

Akhilesh Bharadwaj

Reputation: 946

Start Logstash with -w 1 so that it runs a single pipeline worker. Your logstash.yml sets pipeline.workers: 2, which means two workers process batches of events in parallel after the Kafka input, so events can reach the gelf and stdout outputs out of order even though the single-partition topic delivers them in order. With one worker, batches are processed sequentially and the Kafka order is preserved through the filters and outputs.
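For example, instead of passing -w 1 on the command line, the same effect can be achieved in the pipelines.yml from your question (a minimal sketch; the per-pipeline pipeline.workers value here overrides the one in logstash.yml for this pipeline):

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"
  pipeline.workers: 1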

Upvotes: 1
