Reputation: 9855
I am attempting to improve our current serialization performance by switching from the Serializable interface to Externalizable, but I have not found much documentation on best practices for writing custom, performant serialization. My current solution is about twice as fast as stock Java serialization, which, while good, is not the vast improvement I was expecting (Benchmark of serialization techniques/libraries).
For anything but primitives, I've taken the approach of writing a 0 or 1 to indicate whether the field exists, then reading the field only if the value is 1:
// presence flag: 1 means the field was written
if (in.read() == 1) {
    name = in.readUTF();
}
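For reference, here is a minimal sketch of the full write/read pair that pattern implies, using a hypothetical Person class (all names here are illustrative, not from the original code):

import java.io.Externalizable;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;

// Hypothetical class showing the presence-flag pattern on both sides.
public class Person implements Externalizable {
    private String name; // nullable, guarded by a presence flag
    private int age;     // primitive, written directly

    public Person() {
        // Externalizable requires a public no-arg constructor
    }

    @Override
    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeByte(name != null ? 1 : 0); // presence flag
        if (name != null) {
            out.writeUTF(name);
        }
        out.writeInt(age);
    }

    @Override
    public void readExternal(ObjectInput in) throws IOException {
        if (in.readByte() == 1) { // flag says the field was written
            name = in.readUTF();
        }
        age = in.readInt();
    }
}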
Does that sound about right? Are there better encodings to use? What about Maps, Lists, and other complex data structures (a sketch of a size-prefixed encoding is below)? Is the default serialization for Enums fine?
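For illustration, a size-prefixed encoding for a List and a Map might look like this (the CollectionCodec class and its method names are hypothetical):

import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical helpers: write the size first, then each element/entry.
final class CollectionCodec {

    static void writeNames(ObjectOutput out, List<String> names) throws IOException {
        out.writeInt(names.size());
        for (String n : names) {
            out.writeUTF(n); // assumes no null elements
        }
    }

    static List<String> readNames(ObjectInput in) throws IOException {
        int size = in.readInt();
        List<String> names = new ArrayList<>(size);
        for (int i = 0; i < size; i++) {
            names.add(in.readUTF());
        }
        return names;
    }

    static void writeScores(ObjectOutput out, Map<String, Integer> scores) throws IOException {
        out.writeInt(scores.size());
        for (Map.Entry<String, Integer> e : scores.entrySet()) {
            out.writeUTF(e.getKey());
            out.writeInt(e.getValue());
        }
    }

    static Map<String, Integer> readScores(ObjectInput in) throws IOException {
        int size = in.readInt();
        Map<String, Integer> scores = new HashMap<>();
        for (int i = 0; i < size; i++) {
            scores.put(in.readUTF(), in.readInt());
        }
        return scores;
    }
}

For enums, a compact manual option is writing the ordinal as a byte and mapping back via values(), though that silently breaks if the enum's declaration order ever changes; writing the name is safer but larger.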
Thanks.
Upvotes: 0
Views: 1091
Reputation: 533520
From a maintainability point of view, I try to use generated Data Transfer Objects. This way you generate the toString, hashCode, equals, readObject, and writeObject methods, and possibly their Builder classes as well, from a single definition.
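For example, the kind of readObject/writeObject pair such generation produces might look roughly like this hand-written sketch (the UserDto class and its fields are hypothetical):

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Objects;

// Hand-written sketch of what a DTO generator might emit from one definition.
public class UserDto implements Serializable {
    private static final long serialVersionUID = 1L;

    private String name;
    private int age;

    // Custom serialized form, written field by field instead of via reflection.
    private void writeObject(ObjectOutputStream out) throws IOException {
        out.writeBoolean(name != null); // presence flag for the nullable field
        if (name != null) {
            out.writeUTF(name);
        }
        out.writeInt(age);
    }

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        if (in.readBoolean()) {
            name = in.readUTF();
        }
        age = in.readInt();
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof UserDto)) return false;
        UserDto that = (UserDto) o;
        return age == that.age && Objects.equals(name, that.name);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, age);
    }

    @Override
    public String toString() {
        return "UserDto{name=" + name + ", age=" + age + "}";
    }
}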
In terms of speed, it depends on what your raw data types are. There are three main costs in serialization/deserialization.
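Which cost dominates depends on your types, so it's worth measuring directly; a rough timing sketch follows (a proper harness such as JMH gives far more trustworthy numbers):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.Arrays;

// Rough sketch: time repeated serialization of a sample object.
// Not a rigorous benchmark; JIT warm-up and GC will skew the numbers.
public class SerializationTimer {
    public static void main(String[] args) throws IOException {
        ArrayList<String> sample = new ArrayList<>(Arrays.asList("a", "b", "c"));
        int iterations = 100_000;

        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(sample);
            }
        }
        long elapsed = System.nanoTime() - start;
        System.out.printf("avg %.2f us per serialization%n",
                elapsed / 1000.0 / iterations);
    }
}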
Upvotes: 0
Reputation: 1500535
Is there any reason not to use an existing serialization framework, just a rather better one than Java's built-in? My own preference is Protocol Buffers, but there are alternatives such as Thrift; the page you've linked to shows plenty. I'd avoid writing your own low-level serialization unless you really can't avoid it.
You should consider both performance and maintainability. Externalizable can give you great performance, but in the end it depends on how you implement it: you could do a good job or a bad job, and it will all be manual.
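For a flavour of what Protocol Buffers usage looks like, here is a sketch, assuming protoc has generated a Person class from a hypothetical person.proto containing message Person { string name = 1; int32 age = 2; }:

import com.google.protobuf.InvalidProtocolBufferException;

// Sketch only: Person is a hypothetical protoc-generated message class.
public class ProtoExample {
    public static void main(String[] args) throws InvalidProtocolBufferException {
        Person person = Person.newBuilder()
                .setName("Alice")
                .setAge(30)
                .build();

        byte[] bytes = person.toByteArray();     // compact binary encoding
        Person parsed = Person.parseFrom(bytes); // deserialization
        System.out.println(parsed.getName());
    }
}

The schema, rather than hand-written stream code, then defines the wire format, which is what buys you both the speed and the maintainability.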
Upvotes: 2