user_1357

Reputation: 7940

Kafka Avro Console Consumer/Producer from and to a File

Is it possible to save Avro-encoded messages to a file using the Avro console consumer, and then read from that file to write to another topic using the Avro console producer, given that I have a schema string available to provide to both tools? Is this a use case that is supported out of the box, or do I need to write a shell script for it?
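Roughly, this is the workflow being asked about, sketched with Confluent's console tools; the topic names, file name, schema file, and URLs are placeholders, and older versions of the producer use `--broker-list` instead of `--bootstrap-server`:

```bash
# Consume Schema-Registry-encoded Avro from a topic; each record is printed as one JSON line
kafka-avro-console-consumer \
  --bootstrap-server localhost:9092 \
  --topic source-topic \
  --from-beginning \
  --property schema.registry.url=http://localhost:8081 \
  > records.json

# Produce those JSON lines to another topic, re-encoding them with the provided schema
kafka-avro-console-producer \
  --bootstrap-server localhost:9092 \
  --topic target-topic \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema="$(cat value-schema.avsc)" \
  < records.json
```

Note that the file written this way contains JSON-rendered records, not an Avro container file.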

Upvotes: 1

Views: 978

Answers (1)

OneCricketeer

Reputation: 191844

The Avro console producer only accepts JSON-encoded strings, not Avro files. However, given an Avro file, you can dump its records as JSON along with its AVSC schema, and that output can then be piped into the console producer (although, in my experience, this doesn't work in all cases).
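For example, a rough sketch of the dump step using Apache Avro's avro-tools jar; the file names are placeholders, and the resulting schema and JSON lines can then be fed to the console producer as in the question's workflow:

```bash
# Extract the writer schema from an existing Avro container file
java -jar avro-tools.jar getschema data.avro > value-schema.avsc

# Dump the file's records as JSON, one record per line
java -jar avro-tools.jar tojson data.avro > records.json
```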

You can use the Kafka Connect S3 or HDFS sink connectors (or Apache NiFi) to consume Schema-Registry-encoded data and write it out as local Avro files.

You can use MinIO or Hadoop Ozone to emulate a local S3 endpoint, or you can use the file:// URI prefix with the HDFS connector to write to the local disk.
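As a rough sketch of that second option, an HDFS sink configuration writing Avro files to local disk via the file:// prefix, posted to a Kafka Connect worker's REST API; the connector name, topic, paths, and URLs are placeholders, and exact property names can vary by connector version:

```bash
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "local-avro-sink",
    "config": {
      "connector.class": "io.confluent.connect.hdfs.HdfsSinkConnector",
      "topics": "source-topic",
      "hdfs.url": "file:///tmp/kafka-avro-files",
      "format.class": "io.confluent.connect.hdfs.avro.AvroFormat",
      "flush.size": "1000",
      "value.converter": "io.confluent.connect.avro.AvroConverter",
      "value.converter.schema.registry.url": "http://localhost:8081"
    }
  }'
```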

There's no HDFS source Kafka connector that I know of, but Confluent does provide an S3 source connector. NiFi can read and write both file locations, so maybe start with it.

Flink or Spark might also work, but their Avro serializer settings for getting data back into the Schema Registry wire format are less straightforward.

Upvotes: 1
