Sumit Deo

How to make the Instaclustr Kafka sink connector work with Avro-serialized values and Postgres?

I have a Kafka topic with Avro-serialized values.

I am trying to set up a JDBC (Postgres) sink connector to dump these messages into a Postgres table, but I am getting the error below:

"org.apache.kafka.common.config.ConfigException: Invalid value io.confluent.connect.avro.AvroConverter for configuration value.converter: Class io.confluent.connect.avro.AvroConverter could not be found."

My Sink.json is

{"name": "postgres-sink",
  "config": {
    "connector.class":"io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max":"1",
    "topics": "<topic_name>",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "instaclustr_schema_registry_host:8085",
    "connection.url": "jdbc:postgresql://postgres:5432/postgres?currentSchema=local",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "auto.create": "true",
    "auto.evolve":"true",
    "pk.mode":"none",
    "table.name.format": "<table_name>"
  }
}

I have also updated connect-distributed.properties (bootstrap servers).
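For reference, that change amounts to pointing the Connect worker at the brokers, roughly like this (the host placeholder and port are assumptions):

```properties
# connect-distributed.properties (excerpt)
# Kafka brokers the Connect worker talks to
bootstrap.servers=<instaclustr_kafka_broker_host>:9092
```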

The command I am running is -

curl -X POST -H "Content-Type: application/json" --data @postgres-sink.json https://<instaclustr_schema_registry_host>:8083/connectors


Answers (1)

Robin Moffatt

io.confluent.connect.avro.AvroConverter is not part of the Apache Kafka distribution. You can either run Apache Kafka as part of Confluent Platform (which ships with the converter, and is the easier option), or download the converter separately and install it yourself.
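A sketch of the standalone route, assuming the Confluent Hub client is available on the worker host (the component directory is an assumption; it must be one of the directories listed in the worker's plugin.path):

```shell
# Install the Avro converter into a directory the Connect worker scans.
# /usr/share/java is an assumption; use a directory from your plugin.path.
confluent-hub install confluentinc/kafka-connect-avro-converter:latest \
  --component-dir /usr/share/java --no-prompt

# Alternatively, download the converter zip from Confluent Hub manually,
# unzip it into a plugin.path directory, and restart the Connect worker.
```

After the worker restarts, re-POST the connector config and the value.converter class should resolve.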

