Reputation: 59
Very new to Kafka and getting confused by the many tutorials out there. Is Schema Registry required for any Kafka Avro setup? How do I set this up in a Spring Boot Kafka project?
Upvotes: 3
Views: 4749
Reputation: 308
No, Confluent Schema Registry is not required to produce or consume Apache Avro records as the key or value of a Kafka record.
Apache Avro's container format is self-contained: the payload always travels together with its schema, so a reader can always interpret the data without consulting any external service.
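As a minimal sketch of that property (the single-field User schema here is made up for illustration), a record can be round-tripped with nothing but the Avro library and no registry in sight:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.file.DataFileStream;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;

// Made-up schema, purely for illustration
Schema schema = new Schema.Parser().parse(
    "{\"type\":\"record\",\"name\":\"User\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}");
GenericRecord user = new GenericData.Record(schema);
user.put("name", "alice");

// Write an Avro container: the schema travels with the payload
ByteArrayOutputStream out = new ByteArrayOutputStream();
try (DataFileWriter<GenericRecord> writer = new DataFileWriter<>(new GenericDatumWriter<>(schema))) {
    writer.create(schema, out);
    writer.append(user);
}

// Read it back: no external registry needed, the schema is in the bytes
try (DataFileStream<GenericRecord> reader =
         new DataFileStream<>(new ByteArrayInputStream(out.toByteArray()), new GenericDatumReader<>())) {
    System.out.println(reader.next().get("name")); // prints: alice
}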
Schema Registry is a supplementary service by Confluent which serves as a central registry for all AVRO schemas used within a given Kafka cluster environment.
Confluent provides an Avro schema serializer and deserializer. The serializer registers each schema with the registry and replaces the schema part of the Avro record with a unique id returned by the registry; the deserializer can then query the registry for the schema using that same id. This saves storage in the cluster, because schemas are not repeated with each record. On the other hand, it increases network traffic and coupling.
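For context, the bytes such a serializer actually puts on the topic follow Confluent's wire format: one magic byte (0), the 4-byte big-endian schema id, then the plain Avro binary encoding. A small sketch of reading the id back out (the helper name is my own):
import java.nio.ByteBuffer;

// Extracts the registry schema id from a Confluent-serialized value
static int schemaId(byte[] value) {
    ByteBuffer buf = ByteBuffer.wrap(value);
    if (buf.get() != 0) {                  // magic byte, always 0 in the current format
        throw new IllegalArgumentException("Unknown magic byte");
    }
    return buf.getInt();                   // 4-byte schema id; Avro binary data follows
}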
In addition, it supports consistent evolution of producer/consumer contracts; see Schema Evolution and Compatibility.
For integration of Schema Registry with Spring, I refer you to the documentation of Spring Kafka and Spring Cloud Schema Registry.
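To give a flavour of what that wiring can look like, here is a rough sketch of a Spring Kafka producer configuration in plain Java config; the bean names and the GenericRecord value type are my assumptions, not taken from those docs:
import java.util.HashMap;
import java.util.Map;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
import io.confluent.kafka.serializers.KafkaAvroSerializer;

@Configuration
public class AvroProducerConfig {
    @Bean
    public ProducerFactory<String, GenericRecord> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, GenericRecord> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}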
Upvotes: 2
Reputation: 32140
Is Schema Registry required for any Kafka Avro setup?
Kafka messages are just bytes. It's up to you how you serialise them. If you use Avro (or Protobuf, or JSON Schema), you can use the Confluent Schema Registry, which provides serialisers and deserialisers for all three and stores each schema for you, embedding only a pointer to it in the actual message stored on Kafka.
In theory you could write raw Avro to Kafka and manage the .avsc schema files yourself, but in practice people just use the Schema Registry.
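For the record, "managing the .avsc yourself" amounts to loading the schema file at runtime, e.g. (the file path is illustrative):
import java.io.File;
import org.apache.avro.Schema;

// Parse a schema you version alongside your code; "user.avsc" is made up
Schema schema = new Schema.Parser().parse(new File("src/main/resources/user.avsc"));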
How do I set this up in a Spring Boot Kafka project?
Try this tutorial
Upvotes: 6
Reputation: 2169
In the Spring docs you can find a simple example showing the basic requirements for setting up a spring-kafka project, from the Maven dependency to basic producer and consumer configs. Here is the link: A Very, Very Quick Example
In addition to the above example, to point your Kafka clients at Schema Registry you need the configurations below.
For the producer:
import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
import io.confluent.kafka.serializers.KafkaAvroSerializer;

Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081"); // registry endpoint
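For completeness, a sketch of sending one record with those properties; the users topic and inline schema are made up for illustration:
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Build a record against an inline, illustrative schema
Schema schema = new Schema.Parser().parse(
    "{\"type\":\"record\",\"name\":\"User\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}");
GenericRecord user = new GenericData.Record(schema);
user.put("name", "alice");

// The serializer registers the schema and sends only its id with the payload
try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
    producer.send(new ProducerRecord<>("users", "user-1", user));
}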
For the consumer:
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;

Properties props = new Properties();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(ConsumerConfig.GROUP_ID_CONFIG, groupName); // your consumer group id
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class.getName());
// false => values arrive as GenericRecord; true would require generated SpecificRecord classes
props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, false);
props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
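And a sketch of the consuming side; the topic name is again made up, and values come back as GenericRecord because specific.avro.reader is false:
import java.time.Duration;
import java.util.Collections;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
    consumer.subscribe(Collections.singletonList("users")); // illustrative topic
    ConsumerRecords<String, GenericRecord> records = consumer.poll(Duration.ofSeconds(1));
    for (ConsumerRecord<String, GenericRecord> record : records) {
        // The deserializer fetched the writer's schema from the registry by id
        System.out.println(record.value().get("name"));
    }
}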
The above configs work for me with kafka_2.11-2.1.1 and Schema Registry 5.1.2. Hope this helps.
Upvotes: 2