Reputation: 405
I use the Kryo serializer in a project that works with Spark and is written in Scala. I register all the classes I use in the project, but there is one class that does not serialize or deserialize: scala.collection.mutable.LinkedHashMap. The registration looks like this:
kryo.register(classOf[scala.collection.mutable.LinkedHashMap[_, _]])
I get my data from Elasticsearch, and at runtime the type of the document properties is LinkedHashMap. For example, a document such as person{age: 20, name: "yarden"} maps to a LinkedHashMap[String, Any], and if there is a nested object inside person it maps to a nested LinkedHashMap as well.
When I want to deserialize data that is stored as a LinkedHashMap (for example when collecting an RDD), the result is just an empty object.
I tried to use the MapSerializer (as another parameter to the register function), but it fails because it is meant for Java's LinkedHashMap. I searched for a suitable serializer for Scala's LinkedHashMap but didn't find one. There is a workaround: every time I get a LinkedHashMap I convert it to a Map (sketched below), and that works, but it is not good practice. I also thought there might be a way to make the runtime type Map instead of LinkedHashMap, but I didn't find a solution for that either.
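For reference, the conversion workaround looks roughly like this (a minimal sketch; the toImmutable helper name and the example usage are just for illustration):

import scala.collection.mutable

// Recursively copy the mutable.LinkedHashMap values returned at runtime
// into immutable Maps, so Kryo's default serializers can handle them.
def toImmutable(value: Any): Any = value match {
  case m: mutable.LinkedHashMap[String @unchecked, Any @unchecked] =>
    m.map { case (k, v) => k -> toImmutable(v) }.toMap
  case other => other
}

// e.g. docsRdd.map(toImmutable).collect()  // docsRdd: RDD of documents read from Elasticsearch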
I believe the best practice would be to find a serializer that is suitable for Scala's LinkedHashMap, but I haven't found one.
Any solution to any of the things I didn't manage to solve is welcome.
Upvotes: 1
Views: 1013
Reputation: 21
Try using the TraversableSerializer.
For example:
import scala.collection.mutable
import com.twitter.chill.TraversableSerializer
kryo.register(classOf[mutable.LinkedHashMap[Any, Any]],
  new TraversableSerializer[(Any, Any), mutable.LinkedHashMap[Any, Any]](true))
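As far as I can tell, TraversableSerializer comes from Twitter's chill library (which Spark already depends on). It writes the elements of the collection one by one and rebuilds the collection on read, which is probably why it works for Scala collections like LinkedHashMap that the default registration was giving you back empty.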
Upvotes: 1