Reputation: 1103
I am using protobuf-net for performance reasons, in a scenario where the assemblies that deserialize the saved data are the same ones that serialized it.
Most of the types I serialize are simple contracts marked with ProtoContract and ProtoMember attributes, but occasionally I have to serialize awkward objects with many subclasses (e.g. Exception).
I made it work with the following workaround, which falls back to the classic ISerializable/BinaryFormatter mechanism.
I am pretty new to protobuf-net and would like to know whether this is a good idea and whether there are better/standard ways to do it.
My workaround:
I defined a generic surrogate that wraps classic binary serialization:
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[ProtoContract]
class BinarySerializationSurrogate&lt;T&gt;
{
    [ProtoMember(1)]
    byte[] objectData = null;

    // Deserialize the wrapped byte[] payload back into a T via BinaryFormatter.
    public static implicit operator T(BinarySerializationSurrogate&lt;T&gt; surrogate)
    {
        T ret = default(T);
        if (surrogate == null)
            return ret;

        var serializer = new BinaryFormatter();
        using (var serializedStream = new MemoryStream(surrogate.objectData))
            ret = (T)serializer.Deserialize(serializedStream);
        return ret;
    }

    // Serialize a T into a byte[] payload via BinaryFormatter.
    public static implicit operator BinarySerializationSurrogate&lt;T&gt;(T obj)
    {
        if (obj == null)
            return null;

        var ret = new BinarySerializationSurrogate&lt;T&gt;();
        var serializer = new BinaryFormatter();
        using (var serializedStream = new MemoryStream())
        {
            serializer.Serialize(serializedStream, obj);
            ret.objectData = serializedStream.ToArray();
        }
        return ret;
    }
}
In the initialization code I register it as a surrogate for the awkward base types:
RuntimeTypeModel.Default
    .Add(typeof(Exception), false)
    .SetSurrogate(typeof(BinarySerializationSurrogate&lt;Exception&gt;));
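For context, this is how I exercise it: any ProtoContract type with an Exception member now round-trips, with the Exception routed through the surrogate. This is a minimal sketch assuming the surrogate and registration above; the Wrapper type and its Error member are illustrative, not part of my real contracts.

```csharp
// Hypothetical contract type containing one of the "weird" members.
[ProtoContract]
class Wrapper
{
    [ProtoMember(1)]
    public Exception Error { get; set; }
}

// Round-trip: protobuf-net substitutes BinarySerializationSurrogate<Exception>
// for the Error member, storing it as a BinaryFormatter payload in field 1.
using (var ms = new MemoryStream())
{
    var original = new Wrapper { Error = new InvalidOperationException("boom") };
    Serializer.Serialize(ms, original);

    ms.Position = 0;
    var copy = Serializer.Deserialize<Wrapper>(ms);
    Console.WriteLine(copy.Error.Message);
}
```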
Upvotes: 5
Views: 754
Reputation: 1063413
That hybrid setup isn't a scenario that protobuf-net supports directly, and it isn't something I see as a core use-case, but your surrogate approach should work without any major problems (as long as the assemblies stay in sync).
Upvotes: 1