Reputation: 1815
Sometimes it would make sense to serialize it along with the rest of the data. Right now we have to unwrap it first.
Edit: is there some other option available? E.g. Apache Commons Lang has MutableInt, which is a lightweight wrapper around a primitive int.
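For context, the unwrapping workaround amounts to copying the buffer into a plain byte[] before serializing and wrapping it again afterwards; a minimal sketch (class and method names are illustrative):
import java.nio.ByteBuffer;

public class BufferUnwrap {
    static byte[] toBytes(ByteBuffer buf) {
        ByteBuffer copy = buf.duplicate();   // leave the caller's position untouched
        byte[] bytes = new byte[copy.remaining()];
        copy.get(bytes);                     // works for heap and direct buffers
        return bytes;                        // serialize this byte[] instead of the buffer
    }

    static ByteBuffer fromBytes(byte[] bytes) {
        return ByteBuffer.wrap(bytes);       // re-wrap after deserialization
    }
}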
Upvotes: 4
Views: 2535
Reputation: 3157
I answered in another post with an example of custom serialization: How to serialize ByteBuffer
Basically, the point is that ByteBuffer is a wrapper around a byte array, so it makes little sense to serialize the buffer itself (serialize the byte[] instead). In case you really do need to serialize a ByteBuffer, do something like:
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.ByteBuffer;

public class NetByteBuffer implements Serializable {

    private static final long serialVersionUID = -2831273345165209113L;

    // serializable property
    String anotherProperty;

    // marked transient so it is not serialized by default
    transient ByteBuffer data;

    public NetByteBuffer(String anotherProperty, ByteBuffer data) {
        this.data = data;
        this.anotherProperty = anotherProperty;
    }

    public ByteBuffer getData() {
        return this.data;
    }

    private void writeObject(ObjectOutputStream out) throws IOException {
        // write default (non-transient) properties
        out.defaultWriteObject();
        // write buffer capacity and backing array (assumes a heap buffer)
        out.writeInt(data.capacity());
        out.write(data.array());
    }

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        // read default (non-transient) properties
        in.defaultReadObject();
        // read the buffer contents and wrap them in a ByteBuffer
        int bufferSize = in.readInt();
        byte[] buffer = new byte[bufferSize];
        in.readFully(buffer, 0, bufferSize); // readFully guarantees all bytes are read
        this.data = ByteBuffer.wrap(buffer, 0, bufferSize);
    }

    public String getAnotherProperty() {
        return anotherProperty;
    }
}
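Not part of the original answer, but a quick round-trip sketch of how NetByteBuffer would be used, serializing to an in-memory stream:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.nio.ByteBuffer;

public class NetByteBufferDemo {
    public static void main(String[] args) throws Exception {
        NetByteBuffer original =
                new NetByteBuffer("hello", ByteBuffer.wrap(new byte[] {1, 2, 3, 4}));

        // serialize to an in-memory stream
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(original);
        }

        // deserialize; readObject() rebuilds the transient ByteBuffer
        try (ObjectInputStream ois =
                     new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            NetByteBuffer copy = (NetByteBuffer) ois.readObject();
            System.out.println(copy.getAnotherProperty());   // hello
            System.out.println(copy.getData().capacity());   // 4
        }
    }
}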
Upvotes: 3
Reputation: 24202
My guess would be that, since the contents of a ByteBuffer are essentially already a blob and reading/writing them to/from streams or channels is not complicated, the designers of the language saw no need to make ByteBuffer serializable.
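For instance, dumping a buffer straight to a FileChannel needs no serialization machinery at all; a minimal sketch (class and method names are mine):
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class ChannelDump {
    static void dump(ByteBuffer buf, Path target) throws IOException {
        try (FileChannel ch = FileChannel.open(target,
                StandardOpenOption.CREATE, StandardOpenOption.WRITE)) {
            while (buf.hasRemaining()) {
                ch.write(buf);   // the channel consumes the buffer's remaining bytes directly
            }
        }
    }
}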
You could in theory make your own Externalizable ByteBuffer impl, something like:
package java.nio; // has to live in the java.nio package: _get() and _put() are package-private

import java.io.Externalizable;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;

public class SerializableByteBuffer extends ByteBuffer implements Externalizable {

    private ByteBuffer theActualBuffer;

    public SerializableByteBuffer(ByteBuffer theActualBuffer) {
        super(0, 0, 1, 1);
        this.theActualBuffer = theActualBuffer;
    }

    // these two are package-private; ByteBuffer was obviously not designed to be extended
    @Override
    byte _get(int i) {
        return theActualBuffer._get(i);
    }

    @Override
    void _put(int i, byte b) {
        theActualBuffer._put(i, b);
    }

    @Override
    public void writeExternal(ObjectOutput out) throws IOException {
        // write length + type of underlying buffer (enum?) + contents
    }

    @Override
    public void readExternal(ObjectInput in) throws IOException, ClassNotFoundException {
        // read length and type of buffer, instantiate a buffer of the correct type, read contents into it
    }

    // delegate all remaining methods; this is a lot of work as some return buffer copies
}
But given that you'd have to place it in the java.nio package and correctly delegate ~20 methods (some of which are tricky), it would be a lot of hard work and the result would never be pretty.
Also, the actual (de)serialization will never be truly efficient, as there is no way (that I know of) to get a Channel from an ObjectOutput, which means you'd have to do it the old-fashioned way with an intermediate byte[4096] buffer or something similar.
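For illustration, that old-fashioned copy loop inside writeExternal() might look roughly like this (a sketch, not from the original answer; the helper name is made up):
import java.io.IOException;
import java.io.ObjectOutput;
import java.nio.ByteBuffer;

// Hypothetical helper: copies a ByteBuffer into an ObjectOutput through a
// scratch array, since ObjectOutput exposes no Channel to write to directly.
final class BufferCopy {
    static void writeContents(ObjectOutput out, ByteBuffer buf) throws IOException {
        ByteBuffer src = buf.duplicate();      // leave the caller's position alone
        out.writeInt(src.remaining());         // record how many bytes follow
        byte[] scratch = new byte[4096];
        while (src.hasRemaining()) {
            int chunk = Math.min(scratch.length, src.remaining());
            src.get(scratch, 0, chunk);        // drain the buffer...
            out.write(scratch, 0, chunk);      // ...into the object stream
        }
    }
}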
Upvotes: 2
Reputation: 73568
Does it really matter why? There is, however, a real catch when direct buffers are used, since they typically have no accessible backing array.
If you need to do this just occasionally, you can write your own writeObject()/readObject() implementations for handling the serialization.
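A rough sketch of what such writeObject()/readObject() methods could look like, including the direct-buffer case where array() is unavailable (class and field names are illustrative):
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.ByteBuffer;

public class BufferHolder implements Serializable {
    private static final long serialVersionUID = 1L;

    private transient ByteBuffer data;
    private boolean direct;                     // remember which kind of buffer to recreate

    public BufferHolder(ByteBuffer data) {
        this.data = data;
        this.direct = data.isDirect();
    }

    private void writeObject(ObjectOutputStream out) throws IOException {
        out.defaultWriteObject();
        ByteBuffer src = data.duplicate();
        byte[] bytes = new byte[src.remaining()];
        src.get(bytes);                         // copies even when array() would throw
        out.writeInt(bytes.length);
        out.write(bytes);
    }

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        byte[] bytes = new byte[in.readInt()];
        in.readFully(bytes);
        data = direct ? ByteBuffer.allocateDirect(bytes.length) : ByteBuffer.allocate(bytes.length);
        data.put(bytes).flip();                 // restore contents and reset position
    }
}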
Upvotes: 1