Reputation: 709
I'm serializing/deserializing with BinaryFormatter; the resulting serialized file is ~80 MB. Deserialization takes a few minutes. How could I improve on this? Here's the deserialization code:
public static Universe DeserializeFromFile(string filepath)
{
    Universe universe = null;
    FileStream fs = new FileStream(filepath, FileMode.Open);
    BinaryFormatter bf = new BinaryFormatter();
    try
    {
        universe = (Universe)bf.Deserialize(fs);
    }
    catch (SerializationException e)
    {
        Console.WriteLine("Failed to deserialize. Reason: " + e.Message);
        throw;
    }
    finally
    {
        fs.Close();
    }
    return universe;
}
Should I read the whole file into memory before deserializing, or use some other serialization technique?
Upvotes: 7
Views: 7210
Reputation: 458
I know this is an old question, but I stumbled upon a solution that improved my deserialization speed substantially. This is useful if you have large sets of data.
Upgrade your target framework to 4.7.1+ and enable the following switch in your app.config.
<runtime>
<!-- Use this switch to make BinaryFormatter fast with large object graphs starting with .NET 4.7.2 -->
<AppContextSwitchOverrides value="Switch.System.Runtime.Serialization.UseNewMaxArraySize=true" />
</runtime>
Sources: BinaryFormatter AppContextSwitchOverrides
Upvotes: 2
Reputation: 1062550
How complex is the data? If it is an object tree (rather than a full graph), then you might get some interesting results from trying protobuf-net. It is usually easy to fit onto existing classes, and the output is generally much smaller, faster to process, and less brittle (you can change the object model without trashing the data).
Disclosure: I'm the author, so I might be biased - but it really isn't terrible... I'd happily lend some* time to help you try it, though.
*=within reason
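To give an idea of what adopting protobuf-net looks like, here is a minimal sketch. The Universe/Body shape is hypothetical (your real model will differ); the point is the [ProtoContract]/[ProtoMember] attributes and the Serializer calls:

using System.Collections.Generic;
using System.IO;
using ProtoBuf;

// Hypothetical model for illustration; the numbers identify members on the wire,
// so the class can evolve later without breaking previously written files.
[ProtoContract]
public class Universe
{
    [ProtoMember(1)]
    public List<Body> Bodies { get; set; }
}

[ProtoContract]
public class Body
{
    [ProtoMember(1)]
    public string Name { get; set; }

    [ProtoMember(2)]
    public double Mass { get; set; }
}

public static class UniverseStore
{
    public static void SerializeToFile(string filepath, Universe universe)
    {
        using (var fs = File.Create(filepath))
        {
            Serializer.Serialize(fs, universe);
        }
    }

    public static Universe DeserializeFromFile(string filepath)
    {
        using (var fs = File.OpenRead(filepath))
        {
            return Serializer.Deserialize<Universe>(fs);
        }
    }
}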
Upvotes: 0
Reputation: 4445
Try reading the file into a memory stream first in one go, then deserialize using the memory stream.
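A minimal sketch of that approach, reusing the Universe type and BinaryFormatter from the question:

using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static Universe DeserializeFromFile(string filepath)
{
    // Read the whole file into memory in one pass, then deserialize from the
    // in-memory buffer so the formatter's many small reads hit RAM, not disk.
    byte[] buffer = File.ReadAllBytes(filepath);
    using (var ms = new MemoryStream(buffer))
    {
        var bf = new BinaryFormatter();
        return (Universe)bf.Deserialize(ms);
    }
}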
Upvotes: 0