Reputation: 129
I am using Spark to read data from a Kafka topic. I have to deserialize the Avro data with KafkaAvroDeserializer. I configure the Kafka consumer like this:
kafkaParams.put("bootstrap.servers", "10.0.4.215:9092");
kafkaParams.put("key.deserializer", io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
kafkaParams.put("value.deserializer",io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
// kafkaParams.put("key.convert", com.datamountaineer.streamreactor.connect.converters.source.JsonSimpleConverter.class);
//kafkaParams.put("value.convert",com.datamountaineer.streamreactor.connect.converters.source.JsonSimpleConverter.class);
kafkaParams.put("group.id", "use_a_separate_group_id_for_each_stream");
kafkaParams.put("auto.offset.reset", "earliest");
kafkaParams.put("enable.auto.commit", false);
But when I execute the code I get this exception:
Exception in thread "streaming-start" java.lang.NoClassDefFoundError: io/confluent/common/config/ConfigException
Can someone tell me where I can find this class definition, e.g. which Maven dependency provides it?
Upvotes: 4
Views: 7600
Reputation: 1
I had the same issue. I was using Confluent Platform version 5.1.0. I checked the Kafka <-> Confluent compatibility matrix and found that there was a newer release at the same compatibility level. Updating to 5.1.1 resolved the issue for me.
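As an example, assuming the Confluent Avro serializer is pulled in as a Maven dependency (io.confluent:kafka-avro-serializer is the usual artifact; your build may use a different one), the fix is just the version bump:

<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <!-- was 5.1.0; bumping to 5.1.1 resolved the NoClassDefFoundError -->
    <version>5.1.1</version>
</dependency>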
Upvotes: 0
Reputation: 11
You need the following dependency (Gradle notation): group: 'io.confluent', name: 'common-config', version: yourConfluentVersion
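The Maven equivalent would look roughly like this (the Confluent artifacts are hosted in the Confluent repository rather than Maven Central, so that repository may also need to be declared):

<repositories>
    <repository>
        <id>confluent</id>
        <url>https://packages.confluent.io/maven/</url>
    </repository>
</repositories>

<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>common-config</artifactId>
    <!-- use the same version as the rest of your Confluent dependencies -->
    <version>${confluent.version}</version>
</dependency>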
Upvotes: 1