Itzblend

Reputation: 129

Kafka: creating stream from topic with values in separate columns

I just connected my Kafka to Postgres with a Postgres source connector. Now when I print the topic I get the following output:

rowtime: 4/1/20 4:16:12 PM UTC, key: <null>, value: {"userid": 4, "id": 5, "title": "lorem", "body": "dolor sit amet, consectetur"}
rowtime: 4/1/20 4:16:12 PM UTC, key: <null>, value: {"userid": 5, "id": 6, "title": "ipsum", "body": "cupidatat non proident"}

How do I make a stream from this topic so that the values are separated into their own columns, as they were in the original database table?

Bonus question: Is there any way to tell the JDBC connector to separate the columns in the topic when creating the source connector?

My connector looks like this:

curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
        "name": "jdbc_source_postgres_02",
        "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                "connection.url": "jdbc:postgresql://postgres:5432/kafka",
                "connection.user": "bob",
                "connection.password": "builder",
                "topic.prefix": "post_",
                "mode":"bulk",
                "table.whitelist" : "kafka_t.users_t",
                "poll.interval.ms" : 500
                }
        }'

Upvotes: 0

Views: 381

Answers (1)

Matthias J. Sax

Reputation: 62310

How do I make a stream from this topic so that the values are separated into their own columns, as they were in the original database table?

Not 100% sure what you mean by this. If you use Kafka Streams, you can, for example, create a KStream<KeyType, Columns> with a custom Columns type (or just use JSON as the value type) to get a "column view" on your data.
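
For example, here is a minimal Kafka Streams sketch. It assumes the connector writes to a topic named post_users_t (topic.prefix plus table name), that the record value is the plain JSON string shown in your output, and it uses Jackson for parsing; the Post POJO and topic name are illustrative assumptions, not something your setup guarantees:

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class ColumnViewStream {

    // Illustrative POJO mirroring the JSON fields shown in the topic output.
    public static class Post {
        public int userid;
        public int id;
        public String title;
        public String body;
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "column-view-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        ObjectMapper mapper = new ObjectMapper();
        StreamsBuilder builder = new StreamsBuilder();

        // Assumption: the connector writes to "post_users_t" (topic.prefix + table name)
        // and the record value is the plain JSON string shown above.
        KStream<String, String> raw =
                builder.stream("post_users_t", Consumed.with(Serdes.String(), Serdes.String()));

        // Parse each JSON value into the typed Post, i.e. a "column view" of the row.
        KStream<String, Post> posts = raw
                .mapValues(value -> {
                    try {
                        return mapper.readValue(value, Post.class);
                    } catch (Exception e) {
                        return null; // malformed record; dropped by the filter below
                    }
                })
                .filter((key, post) -> post != null);

        posts.foreach((key, post) ->
                System.out.printf("userid=%d id=%d title=%s%n", post.userid, post.id, post.title));

        new KafkaStreams(builder.build(), props).start();
    }
}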

Similarly, you could use ksqlDB with a CREATE STREAM command -- it can automatically parse the JSON value into corresponding columns.
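
A minimal sketch of that, again assuming the topic is named post_users_t; you declare the columns and ksqlDB parses each JSON field into its own column:

CREATE STREAM posts (
    userid INT,
    id INT,
    title VARCHAR,
    body VARCHAR
) WITH (
    KAFKA_TOPIC = 'post_users_t',
    VALUE_FORMAT = 'JSON'
);

-- Each JSON field is now queryable as a separate column:
SELECT userid, id, title, body FROM posts EMIT CHANGES;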

Bonus question: Is there any way to tell the JDBC connector to separate the columns in the topic when creating the source connector?

What do you mean by that? Kafka topics have a key-value data model, so if you store any data in a topic, it must go into either the key or the value. If you have a more structured type, like a DB tuple, there is no native support for it in the Kafka brokers; you need to fit it into the key-value model.

Upvotes: 1
