user11725421

Reputation:

JsonMappingException: Not a map, not an array or not an enum

I am working on a REST API application and I have some problems with the code. I want to return an object that extends an Avro-generated class and adds HATEOAS links to the response. I did some investigation, and as it turns out I need a custom serializer for the object.

Part of code that is problematic is:

@JsonValue
public String serialize() throws JsonProcessingException {
    ObjectMapper mapper = new ObjectMapper();
    return mapper.writeValueAsString(this);
}

This returns

Could not write JSON: Not an enum.

The current object extends an Avro-generated class. I also tried:

@JsonValue
public String serialize() throws JsonProcessingException {
    ObjectMapper mapper = new ObjectMapper();
    return mapper.writeValueAsString(getUserData());
}

It returns:

Could not write JSON: Not an array

Where `getUserData()` returns an actual Avro object. I don't understand what these errors mean. Can someone explain? Also, is there a better way to return an Avro object combined with other parameters?

Thanks

EDIT:

Here is the full example:

public class PaginatedUserData extends UserDataAvro {
    @JsonValue
    public String serialize() throws JsonProcessingException {
        ObjectMapper mapper = new ObjectMapper();
        HashMap<String, String> map = new HashMap<>();
        map.put("test", "5");
        return mapper.writeValueAsString(map);
    }
}

Returns:

Could not write JSON: Not an array

Upvotes: 8

Views: 14361

Answers (3)

Renato Gama

Reputation: 16519

I had the same problem when serializing a Kafka message out of an @AvroGenerated class (Kotlin with the Micronaut framework). What worked for me was making sure to use an Avro serializer, like this:

    override fun getProperties() = mutableMapOf(
        "kafka.bootstrap.servers" to kafka.bootstrapServers,
        "kafka.producers.default.value.serializer" to KafkaAvroSerializer::class.java.name,
        "kafka.producers.default.key.serializer" to StringSerializer::class.java.name,
        "kafka.producers.default.schema.registry.url" to "mock://my-scope-name",
        "kafka.consumers.default.schema.registry.url" to "mock://my-scope-name",
        "kafka.consumers.default." + KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG to "true"
    )

Upvotes: 1

Harald Brabenetz

Reputation: 494

I had a similar problem and solved it with a Jackson mix-in (from https://stackoverflow.com/a/7422285/702345). Avro generates Avro-specific getter methods (`getSchema()`, `getSpecificData()`) which should always be ignored during Jackson serialization.

public abstract class JacksonIgnoreAvroPropertiesMixIn {

  @JsonIgnore
  public abstract org.apache.avro.Schema getSchema();

  @JsonIgnore
  public abstract org.apache.avro.specific.SpecificData getSpecificData();
}

And then on the ObjectMapper:

objectMapper.addMixIn(
        org.apache.avro.specific.SpecificRecord.class, // Interface implemented by all generated Avro-Classes
        JacksonIgnoreAvroPropertiesMixIn.class);
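Putting the two pieces together, a minimal wiring sketch might look like this (assuming a generated Avro record instance `userData` is in scope; this is illustrative configuration, not a complete program):

```java
import com.fasterxml.jackson.databind.ObjectMapper;

ObjectMapper objectMapper = new ObjectMapper();
objectMapper.addMixIn(
        org.apache.avro.specific.SpecificRecord.class, // applies to all generated Avro classes
        JacksonIgnoreAvroPropertiesMixIn.class);

// getSchema() and getSpecificData() are now ignored, so the record's
// data fields serialize to plain JSON instead of failing on Avro internals
String json = objectMapper.writeValueAsString(userData);
```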

Upvotes: 14

Matej J

Reputation: 643

Jackson can be problematic sometimes. These errors mean that Spring Boot expects data in a different shape than what your serializer returns.

You should return an object that has the Avro object as a property, like so:

class UserResponse extends ResourceSupport {
  private UserData userData;
  public UserData getUserData() { return userData; }
}

Then you return this in the controller:

return new UserResponse();
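Independent of the serialization library, the response shape this wrapping pattern produces can be sketched with plain collections (names like `buildResponse` and the link path are illustrative, not from Spring or Avro):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class UserResponseSketch {
    // Compose the response: the Avro payload sits under its own key,
    // and HATEOAS links live alongside it rather than replacing it.
    public static Map<String, Object> buildResponse(Map<String, Object> userData) {
        Map<String, Object> response = new LinkedHashMap<>();
        response.put("userData", userData);
        response.put("_links", Map.of("self", "/users/5"));
        return response;
    }

    public static void main(String[] args) {
        Map<String, Object> user = Map.of("id", 5, "name", "alice");
        System.out.println(buildResponse(user));
    }
}
```

The point of the sketch is that the serialized output is a normal JSON object with `userData` and `_links` keys, which avoids asking Jackson to serialize the Avro class hierarchy directly.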

If this still does not work for you, I suggest you use Gson. For your application it would look something like this:

Gson gson = new Gson();
UserResponse data = new UserResponse();

String json = gson.toJson(data.getUserData());

Upvotes: 1
