Reputation: 91
I am getting a duplicate key exception while parsing a JSON response containing timestamps as keys using GSON. It gives the following error:
```
com.google.gson.JsonSyntaxException: duplicate key: 1463048935
    at com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter.read(MapTypeAdapterFactory.java:186)
    at com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter.read(MapTypeAdapterFactory.java:141)
```
How do I make it ignore the duplicates and just parse the response into a map, keeping any one of the duplicate entries?
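For reference, here is a stripped-down snippet that reproduces the exception (the JSON literal below is a simplified stand-in for my actual response):
```
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import java.lang.reflect.Type;
import java.util.Map;

public class DuplicateKeyRepro {
    public static void main(String[] args) {
        // The same timestamp key appears twice in the payload
        String json = "{\"1463048935\": \"first\", \"1463048935\": \"second\"}";
        Type mapType = new TypeToken<Map<String, String>>() {}.getType();

        // Throws com.google.gson.JsonSyntaxException: duplicate key: 1463048935
        Map<String, String> map = new Gson().fromJson(json, mapType);
        System.out.println(map);
    }
}
```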
Upvotes: 5
Views: 9291
Reputation: 1
I couldn't use Kotlin (as in the earlier answer), so I adjusted it to Java. It can be achieved by registering a type adapter:
```
@Test
void testDuplicatesIgnored() {
    String json = "{\"key\": \"value\", \"key\": \"value2\"}";
    Gson gson = new GsonBuilder()
            .registerTypeAdapter(Map.class, new JsonDeserializer<Map<String, Object>>() {
                @Override
                public Map<String, Object> deserialize(JsonElement json1, Type typeOfT, JsonDeserializationContext context) throws JsonParseException {
                    return new Gson().fromJson(json1, typeOfT);
                }
            })
            .create();

    Type mapType = new TypeToken<Map<String, Object>>() {}.getType();
    Map<String, Object> map = gson.fromJson(json, mapType);
    System.out.println("map = " + map); // map = {key=value2}
    assertThat(map).hasSize(1);
    assertThat(map.get("key")).isEqualTo("value2");
}
```
This way, all mappings to Map.class will go through your deserializer code. It works because Gson first parses the input into a JsonElement tree before invoking your deserializer, and that tree keeps only the last value for a duplicate key, so the delegated fromJson call never sees the duplicates. Yes, it looks like a dirty hack, but it works.
Another way is to register a type adapter for your custom type, so that the deserializer is called only where you need it:
```
@Test
void testDuplicatesIgnored() {
    String json = "{\"key\": \"value\", \"key\": \"value2\"}";
    Gson gson = new GsonBuilder()
            .registerTypeAdapter(MapIgnoringDuplicatesContainer.class, new JsonDeserializer<MapIgnoringDuplicatesContainer>() {
                @Override
                public MapIgnoringDuplicatesContainer deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context) throws JsonParseException {
                    Type mapType = new TypeToken<Map<String, Object>>() {}.getType();
                    return new MapIgnoringDuplicatesContainer(new Gson().fromJson(json, mapType));
                }
            })
            .create();

    Map<String, Object> map = gson.fromJson(json, MapIgnoringDuplicatesContainer.class).getMap();
    System.out.println("map = " + map); // map = {key=value2}
    assertThat(map).hasSize(1);
    assertThat(map.get("key")).isEqualTo("value2");
}

private class MapIgnoringDuplicatesContainer {

    private Map<String, Object> map;

    public MapIgnoringDuplicatesContainer(Map<String, Object> map) {
        this.map = map;
    }

    public Map<String, Object> getMap() {
        return map;
    }
}
```
Upvotes: 0
Reputation: 5911
Hackerman solution, tested and working using GSON v2.8.5, but use at your own risk! Whenever you update GSON to a new version, make sure to check whether this is still working!
Basically, you can use the fact that the generic ObjectTypeAdapter ignores duplicates as seen here:
Looks like MapTypeAdapterFactory checks for duplicates:
```
V replaced = map.put(key, value);
if (replaced != null) {
    throw new JsonSyntaxException("duplicate key: " + key);
}
```
however, ObjectTypeAdapter does not:
```
case BEGIN_OBJECT:
    Map<String, Object> map = new LinkedTreeMap<String, Object>();
    in.beginObject();
    while (in.hasNext()) {
        map.put(in.nextName(), read(in));
    }
    in.endObject();
    return map;
```
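To see the difference in practice, here is a quick sketch (the JSON literal and class name are my own example, reusing the duplicate key from the question; tested behaviour is for GSON 2.8.5):
```
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import java.util.Map;

public class DuplicateBehaviourDemo {
    public static void main(String[] args) {
        Gson gson = new Gson();
        String json = "{\"1463048935\": \"first\", \"1463048935\": \"second\"}";

        // ObjectTypeAdapter path: duplicates are silently overwritten, last value wins
        Object viaObject = gson.fromJson(json, Object.class);
        System.out.println(viaObject); // {1463048935=second}

        // MapTypeAdapterFactory path: throws JsonSyntaxException: duplicate key: 1463048935
        Map<String, Object> viaMap = gson.fromJson(json,
                new TypeToken<Map<String, Object>>() {}.getType());
    }
}
```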
What you can do now is try to deserialize using fromJson as usual, but catch the "duplicate key" exception, deserialize as a generic Object (which ignores the duplicates), serialize it again (which yields a JSON string without duplicate keys), and finally deserialize using the type it is actually meant to be.
Here is a Kotlin code example:
```
fun <T> String.deserialize(gson: Gson, typeToken: TypeToken<T>): T {
    val type = typeToken.type
    return try {
        gson.fromJson<T>(this, type)
    } catch (e: JsonSyntaxException) {
        if (e.message?.contains("duplicate key") == true) {
            gson.toJson(deserialize(gson, object : TypeToken<Any>() {})).deserialize(gson, typeToken)
        } else {
            throw e
        }
    }
}
```
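If you need the same trick in plain Java, a rough equivalent might look like this (a sketch only; the class and method names are mine, and it has not been verified against every GSON version):
```
import com.google.gson.Gson;
import com.google.gson.JsonSyntaxException;
import com.google.gson.reflect.TypeToken;

public final class GsonDuplicateKeys {

    private GsonDuplicateKeys() {}

    public static <T> T deserializeIgnoringDuplicates(Gson gson, String json, TypeToken<T> typeToken) {
        try {
            // First attempt: normal deserialization, which fails on duplicate keys
            return gson.fromJson(json, typeToken.getType());
        } catch (JsonSyntaxException e) {
            if (e.getMessage() != null && e.getMessage().contains("duplicate key")) {
                // Fallback: parse as a generic Object (duplicates are silently dropped),
                // re-serialize to a clean JSON string, then parse with the intended type
                Object generic = gson.fromJson(json, Object.class);
                String cleaned = gson.toJson(generic);
                return gson.fromJson(cleaned, typeToken.getType());
            }
            throw e;
        }
    }
}
```
Usage would then be, for example: GsonDuplicateKeys.deserializeIgnoringDuplicates(gson, json, new TypeToken<Map<String, Object>>() {}).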
Obviously, this adds (potentially heavy) overhead by requiring two deserializations and an additional serialization, but currently I don't see any other way to do this. If Google decides to add a built-in option to ignore duplicates, as suggested here, you should switch to that.
Upvotes: 2