Zichen Ma

Reputation: 987

Avro: org.apache.avro.AvroTypeException: Expected long. Got START_OBJECT

I am working on an Avro schema and trying to create some test data to test it with Kafka, but when I produce a message I get this error: "Caused by: org.apache.avro.AvroTypeException: Expected long. Got START_OBJECT". The schema I created is:

{
    "name": "MyClass",
    "type": "record",
    "namespace": "com.acme.avro",
    "doc":"This schema is for streaming information",
    "fields":[
        {"name":"batchId", "type": "long"},
        {"name":"status", "type": {"type": "enum", "name": "PlannedTripRequestedStatus", "namespace":"com.acme.avro.Dtos", "symbols":["COMPLETED", "FAILED"]}},
        {"name":"runRefId", "type": "int"},
        {"name":"tripId", "type": ["null", "int"]},
        {"name": "referenceNumber", "type": ["null", "string"]},
        {"name":"errorMessage", "type": ["null", "string"]}
    ]
}

The test data is:

{
    "batchId": {
        "long": 3
    },
    "status": "COMPLETED",
    "runRefId": {
        "int": 1000
    },
    "tripId": {
        "int": 200
    },
    "referenceNumber": {
        "string": "ReferenceNumber1111"
    },
    "errorMessage": {
        "string": "Hello World"
    }
}

However, when I registered the schema and tried to produce a message with the Confluent console tool, I got the error org.apache.avro.AvroTypeException: Expected long. Got START_OBJECT. The full error message is:

org.apache.kafka.common.errors.SerializationException: Error deserializing {"batchId": ...} to Avro of schema {"type":...}
    at io.confluent.kafka.formatter.AvroMessageReader.readFrom(AvroMessageReader.java:134)
    at io.confluent.kafka.formatter.SchemaMessageReader.readMessage(SchemaMessageReader.java:325)
    at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:51)
    at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: org.apache.avro.AvroTypeException: Expected long. Got START_OBJECT
    at org.apache.avro.io.JsonDecoder.error(JsonDecoder.java:511)
    at org.apache.avro.io.JsonDecoder.readLong(JsonDecoder.java:177)
    at org.apache.avro.io.ResolvingDecoder.readLong(ResolvingDecoder.java:169)
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:197)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160)
    at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:259)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:247)
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
    at io.confluent.kafka.schemaregistry.avro.AvroSchemaUtils.toObject(AvroSchemaUtils.java:213)
    at io.confluent.kafka.formatter.AvroMessageReader.readFrom(AvroMessageReader.java:124)

Does anyone know what I did wrong with my schema or test data? Thank you so much!

Upvotes: 0

Views: 4415

Answers (1)

OneCricketeer

Reputation: 191681

You only need the type-wrapper object when the type would otherwise be ambiguous (a union of string and number, for example) or when the field is nullable.

For batchId and runRefId, just use simple values, for example:
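With the schema above, the test record could look something like this (the nullable union fields keep the branch-name wrapper that the Avro JSON encoding requires, while batchId and runRefId are written as bare numbers):

{
    "batchId": 3,
    "status": "COMPLETED",
    "runRefId": 1000,
    "tripId": {
        "int": 200
    },
    "referenceNumber": {
        "string": "ReferenceNumber1111"
    },
    "errorMessage": {
        "string": "Hello World"
    }
}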
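If you want to verify a payload against the schema before involving the registry and the console producer, a minimal sketch (assuming the Avro Java library is on the classpath; MyClass.avsc and test-record.json are hypothetical local file names) is to run it through Avro's JSON decoder, which is the same decode path shown in the stack trace:

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.JsonDecoder;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class AvroJsonCheck {
    public static void main(String[] args) throws IOException {
        // Hypothetical file names: the schema and the JSON test record saved locally.
        Schema schema = new Schema.Parser().parse(Files.newInputStream(Paths.get("MyClass.avsc")));
        String json = new String(Files.readAllBytes(Paths.get("test-record.json")));

        // Decode the JSON with Avro's JsonDecoder and a generic reader,
        // the same way the console producer parses input before serializing.
        JsonDecoder decoder = DecoderFactory.get().jsonDecoder(schema, json);
        GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
        GenericRecord record = reader.read(null, decoder);

        // Prints the parsed record if the payload matches the Avro JSON encoding;
        // otherwise it throws the same AvroTypeException as in the question.
        System.out.println(record);
    }
}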

Upvotes: 2
