Reputation: 14820
I serialize a large Dictionary
to disk with protobuf-net, and the resulting file on disk is 450MB.
The code often fails with an OutOfMemoryException:
System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
at ProtoBuf.BufferPool.ResizeAndFlushLeft(Byte[]& buffer, Int32 toFitAtLeastBytes, Int32 copyFromIndex, Int32 copyBytes) in c:\Dev\protobuf-net\protobuf-net\BufferPool.cs:line 60
at ProtoBuf.ProtoWriter.StartSubItem(Object instance, ProtoWriter writer, Boolean allowFixed) in c:\Dev\protobuf-net\protobuf-net\ProtoWriter.cs:line 341
at ProtoBuf.ProtoWriter.WriteObject(Object value, Int32 key, ProtoWriter writer) in c:\Dev\protobuf-net\protobuf-net\ProtoWriter.cs:line 44
at proto_15(Object , ProtoWriter )
at ProtoBuf.ProtoWriter.WriteObject(Object value, Int32 key, ProtoWriter writer, PrefixStyle style, Int32 fieldNumber) in c:\Dev\protobuf-net\protobuf-net\ProtoWriter.cs:line 116
at ProtoBuf.Meta.TypeModel.SerializeWithLengthPrefix(Stream dest, Object value, Type type, PrefixStyle style, Int32 fieldNumber, SerializationContext context) in c:\Dev\protobuf-net\protobuf-net\Meta\TypeModel.cs:line 548
at ProtoBuf.Meta.TypeModel.SerializeWithLengthPrefix(Stream dest, Object value, Type type, PrefixStyle style, Int32 fieldNumber) in c:\Dev\protobuf-net\protobuf-net\Meta\TypeModel.cs:line 515
at ProtoBuf.Serializer.SerializeWithLengthPrefix[T](Stream destination, T instance, PrefixStyle style, Int32 fieldNumber) in c:\Dev\protobuf-net\protobuf-net\Serializer.cs:line 352
The machine has 20GB+ of free memory, and my process only uses 4.6GB.
The app.config is already tuned for large allocations:
<runtime>
  <gcAllowVeryLargeObjects enabled="true" />
  <gcServer enabled="true" />
</runtime>
Currently I wrap the call in a try-catch and retry after several seconds, which usually works.
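Roughly, the retry looks like this (a simplified sketch; the method name, file path and retry count are placeholders, not my actual code):

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using ProtoBuf;

static void SaveWithRetry<TKey, TValue>(Dictionary<TKey, TValue> data, string path)
{
    const int maxAttempts = 5;
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            using (var file = File.Create(path))
            {
                // Same call that throws in the stack trace above
                Serializer.SerializeWithLengthPrefix(file, data, PrefixStyle.Base128, 1);
            }
            return; // success
        }
        catch (OutOfMemoryException)
        {
            if (attempt >= maxAttempts) throw; // give up eventually
            Thread.Sleep(TimeSpan.FromSeconds(5)); // wait, then retry
        }
    }
}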
Is there a way to avoid this exception?
Upvotes: 2
Views: 1363
Reputation: 1603
We had the same issue in an application that passed 3 million records from the service side to a WPF client via WCF.
The solution was to rebuild the service side (where the serialization happens) as a 64-bit application (x64, not AnyCPU) and make sure the AppPool runs in 64-bit mode ("Allow 32 Bit Applications" set to False).
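A quick way to confirm the serializing process really runs as 64-bit (just an illustrative check; Environment.Is64BitProcess is standard .NET):

using System;

// Under a 32-bit AppPool this prints False and the 32-bit address-space limits still apply.
Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
Console.WriteLine("64-bit OS: " + Environment.Is64BitOperatingSystem);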
Enjoy!
Upvotes: 0
Reputation: 1063338
gcAllowVeryLargeObjects doesn't help much with byte buffers, since the element-count limit still applies - however, it is curious that it is breaking at 450MB; don't get me wrong - that's pretty big. Are you serializing to a file, to the network, or just in-memory? If it isn't purely in-memory, there may be ways to convince it to need less internal buffering - for example, switching to groups rather than sub-messages (this just requires some attribute changes). For example, this:
public class Foo {
    [ProtoMember(1)]
    public List<Bar> Bars { get; set; }
}
will need to buffer the Bar data in memory, whereas:
public class Foo {
    [ProtoMember(1, DataFormat = DataFormat.Group)]
    public List<Bar> Bars { get; set; }
}
will not.
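Putting that together, a minimal end-to-end sketch (the Bar members, ProtoContract attributes and file name are made up for illustration; only the DataFormat.Group part is the actual point):

using System.Collections.Generic;
using System.IO;
using ProtoBuf;

[ProtoContract]
public class Bar {
    [ProtoMember(1)]
    public string Name { get; set; }
}

[ProtoContract]
public class Foo {
    // Group encoding writes start/end markers instead of a length prefix,
    // so the writer doesn't have to hold each sub-message in memory to measure it.
    [ProtoMember(1, DataFormat = DataFormat.Group)]
    public List<Bar> Bars { get; set; }
}

class Program {
    static void Main() {
        var foo = new Foo { Bars = new List<Bar> { new Bar { Name = "example" } } };
        using (var file = File.Create("foo.bin")) {
            Serializer.Serialize(file, foo); // streams straight to the file
        }
    }
}

The only change that matters for the buffering behaviour is DataFormat.Group on the list member.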
Upvotes: 2