Mostafa Ghadimi

Reputation: 6766

Convert Oracle NUMBER data type to Clickhouse valid data types

I am very new to Oracle CDC and how it works. Based on the Oracle documentation, I found that the NUMBER data type can be very dynamic in its precision and scale (unlike a fixed NUMBER(p, s)).

The schema of the data, as emitted by the CDC connector, is as follows:

{
    "name": "USER_CODE",
    "type": {
        "type": "record",
        "namespace": "io.debezium.data",
        "name": "VariableScaleDecimal",
        "fields": [
            {
                "name": "scale",
                "type": "int"
            },
            {
                "name": "value",
                "type": "bytes"
            }
        ]
    }
}

The problem is that whenever I try to read it in ClickHouse using the Kafka table engine, I get the following error:

Type  Float64 is not compatible with Avro record:
"type": "record",
"namespace": "io.debezium.data",
"name": "VariableScaleDecimal",
"fields": [
    {
        "name": "scale",
        "type": "int"
    },
    {
        "name": "value",
        "type": "bytes"
    }
]

P.S.: I also tried other data types such as Dynamic and Decimal, but I got the same error.

Question: Can someone please help me resolve this issue?

Upvotes: 0

Views: 53

Answers (1)

Mostafa Ghadimi

Reputation: 6766

There are several ways to resolve this issue:

  1. Use the decimal.handling.mode option in the connector configuration:

    The default value for this option is precise, which makes Debezium emit NUMBER columns as VariableScaleDecimal records. The other possible values are:

    • double
    • string

    Each fits different use cases. I used double, since I didn't have direct access to Oracle and couldn't change the column's data type.

  2. Use an available SMT such as Cast:

    Note that the Cast SMT does not support the VariableScaleDecimal (Debezium Struct) data type, so it cannot solve this particular case.

  3. Write a custom SMT for the conversion: This approach is the most flexible, but it makes the pipeline more complicated. I recommend it only to those comfortable with the Java programming language.
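For option 1, the change is a single property in the connector configuration. A minimal sketch is below; the connector name, database coordinates, and topic prefix are placeholders you would replace with your own values:

```json
{
  "name": "oracle-connector",
  "config": {
    "connector.class": "io.debezium.connector.oracle.OracleConnector",
    "database.hostname": "oracle-host",
    "database.port": "1521",
    "database.user": "dbzuser",
    "database.password": "dbzpassword",
    "topic.prefix": "server1",

    "decimal.handling.mode": "double"
  }
}
```

With decimal.handling.mode set to double, the USER_CODE field arrives as a plain Avro double instead of a VariableScaleDecimal record. Be aware that double can lose precision for very large or high-scale numbers; string preserves the exact value at the cost of pushing parsing to the consumer.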

After applying one of the approaches above, the remaining step is to set the correct data types for the columns in your ClickHouse table. To verify which types the connector actually emits, inspect the messages in the Kafka topic or the subject in the Schema Registry (if one is configured).
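Assuming decimal.handling.mode is set to double, the column can then be declared as Float64 in the ClickHouse Kafka table. A sketch of the table definition follows; the table name, broker address, topic, and consumer group are placeholders:

```sql
CREATE TABLE kafka_user_codes
(
    USER_CODE Float64
)
ENGINE = Kafka
SETTINGS
    kafka_broker_list = 'kafka:9092',
    kafka_topic_list = 'server1.SCHEMA.TABLE',
    kafka_group_name = 'clickhouse-consumer',
    kafka_format = 'AvroConfluent',
    format_avro_schema_registry_url = 'http://schema-registry:8081';
```

If you use decimal.handling.mode = string instead, declare the column as String and cast it downstream (for example in the materialized view that drains this Kafka table).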

Upvotes: 0
