J.Doe

Reputation: 549

Kafka JDBC Sink connector with JSON messages without schema

I am trying to load JSON messages into a Postgres database using the Postgres sink connector. Everything I have read online only covers including the schema in the JSON message itself, but ideally I would like not to include the schema in the message. Is there a way to register the JSON schema in the Schema Registry and use it the way it's done with Avro?

Also, I'm currently running Kafka from the downloaded binaries, as I had several problems running Kafka Connect with Docker due to ARM compatibility issues. Is there a similar install for Schema Registry? So far I'm only finding the option of downloading it through Confluent and running it in Docker. Is it possible to run only Schema Registry with Docker while keeping my current setup?

Thanks

Upvotes: 0

Views: 1385

Answers (1)

OneCricketeer

Reputation: 191743

JSON without schema

The JDBC sink requires a schema; it needs to know field names and types in order to map records onto table columns.
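
If you stay with the plain org.apache.kafka.connect.json.JsonConverter and set value.converter.schemas.enable=true, every message must carry that schema inline in a schema/payload envelope. As a rough sketch (the record name and fields are only placeholders), each value on the topic would look like:

    {
      "schema": {
        "type": "struct",
        "name": "example.User",
        "optional": false,
        "fields": [
          { "field": "id",   "type": "int32",  "optional": false },
          { "field": "name", "type": "string", "optional": true }
        ]
      },
      "payload": { "id": 1, "name": "alice" }
    }

Repeating the schema in every record is exactly the overhead that using the Registry, as described below, avoids.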

Is there a way to register the JSON schema in the schema registry and use that like it's done with Avro?

Yes, the Registry supports JSONSchema (as well as Protobuf), in addition to Avro. This requires you to use a specific serializer; you cannot just send plain JSON to the topic.
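
Produce with io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer (or a compatible client) so the schema gets registered, and point the sink's converter at the Registry. A minimal connector config sketch, where the connector name, topic, database, and credentials are hypothetical:

    {
      "name": "postgres-sink",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "topics": "my-topic",
        "connection.url": "jdbc:postgresql://localhost:5432/mydb",
        "connection.user": "postgres",
        "connection.password": "postgres",
        "auto.create": "true",
        "value.converter": "io.confluent.connect.json.JsonSchemaConverter",
        "value.converter.schema.registry.url": "http://localhost:8081"
      }
    }

With this, the messages on the topic are just the JSON payload (plus a small magic-byte/schema-id prefix added by the serializer); the schema itself lives in the Registry.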

currently running kafka by downloading the bin... Is there a similar install for schema registry?

The Confluent Schema Registry is not distributed as a standalone package outside of Docker. You'd have to download Confluent Platform in place of Kafka, copy your existing zookeeper.properties and server.properties into it, and then run Schema Registry. Otherwise, compile it from source and build a standalone distribution of it with mvn -Pstandalone package.
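
If you go the Confluent Platform route, the Registry ships with a start script and a sample properties file. Roughly, assuming a local broker on the default port (the listener and bootstrap values here are assumptions for a local setup):

    # etc/schema-registry/schema-registry.properties
    listeners=http://0.0.0.0:8081
    kafkastore.bootstrap.servers=PLAINTEXT://localhost:9092

    # then, from the Confluent Platform directory
    bin/schema-registry-start etc/schema-registry/schema-registry.properties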

Other registries exist as well, such as Apicurio.

Upvotes: 1
