Piotr Kozlowski

Reputation: 1031

Kafka Connect JDBC sink date/time conversion to PostgreSQL timestamp with timezone

We have a Kafka topic with CloudEvents in JSON Schema format that we connected to TimescaleDB/PostgreSQL with the JDBC sink connector.

To convert a table to a hypertable we need a time column, but the date-time format in the events generated by our Go app contains nanoseconds (2024-06-23T19:36:00.104215111Z) and is not compatible with the PostgreSQL timestamp with time zone column, so Kafka Connect returns an error like this:

Batch entry 0 INSERT INTO "test3_new" ("allowed","specversion","id","source","time","type","relation","username","object") VALUES (('TRUE'),('1.0'),('1718828631'),('auth/v1/check'),('2024-06-19T20:23:51.18607753Z'),('auth.check'),('reader'),('user:user1592'),('cluster:example-cluster-us-east'))
was aborted: ERROR: column "time" is of type timestamp with time zone but expression is of type character varying
Hint: You will need to rewrite or cast the expression.

I couldn't find a way to convert the data to the proper format, e.g. using a Kafka Connect transform (SMT) or some built-in Timescale/Postgres function for automatic conversion that could work here.

Upvotes: 0

Views: 162

Answers (1)

Piotr Kozlowski

Reputation: 1031

In the meantime I found this issue: https://github.com/confluentinc/kafka-connect-jdbc/issues/921. It looks like appending ?stringtype=unspecified to the JDBC connection string works fine: the PostgreSQL JDBC driver then sends string parameters as untyped values, so the server infers the column type and casts the timestamp itself.
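For reference, with the Confluent JDBC sink the parameter goes into connection.url in the connector config. A minimal sketch; the connector name, topic, host, database, and credentials below are placeholders, not values from the question:

```json
{
  "name": "timescaledb-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "test3_new",
    "connection.url": "jdbc:postgresql://timescaledb:5432/mydb?stringtype=unspecified",
    "connection.user": "postgres",
    "connection.password": "********",
    "insert.mode": "insert"
  }
}
```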

Upvotes: 1
