dark ninja

Reputation: 15

How to configure the Logstash Kafka input with a JSON deserializer? (Closed)

I have a Logstash config where I need to consume data from a Kafka topic and write it to an Elasticsearch index. This is my conf:

input {
  kafka {
    codec => json {}
    bootstrap_servers => "my_brokers"
    security_protocol => "SASL_PLAINTEXT"
    sasl_mechanism => "SCRAM-SHA-256"
    sasl_jaas_config => "my security config"
    topics => ["MY_TOPIC"]
    value_deserializer_class => "org.apache.kafka.common.serialization.JsonDeserializer"
  }
}

output {
  elasticsearch {
      hosts => ["localhost:9200"]
      index => "my_index"
  }
}

When I run this config, I get the following exception:

Error: Invalid value org.apache.kafka.common.serialization.JsonDeserializer for configuration value.deserializer: Class org.apache.kafka.common.serialization.JsonDeserializer could not be found.
  Exception: Java::OrgApacheKafkaCommonConfig::ConfigException
  Stack: org.apache.kafka.common.config.ConfigDef.parseType(org/apache/kafka/common/config/ConfigDef.java:728)
org.apache.kafka.common.config.ConfigDef.parseValue(org/apache/kafka/common/config/ConfigDef.java:474)
org.apache.kafka.common.config.ConfigDef.parse(org/apache/kafka/common/config/ConfigDef.java:467)

Did I miss anything in my conf?

Upvotes: 0

Views: 417

Answers (0)
