Reputation: 7580
I have a class Bar which contains a List<Foo>, with both Foo and Bar implementing ISerializable.
When deserializing a Bar, the List<Foo> is initially filled with (the correct number of) nulls; then, on exiting the Bar deserialization ctor, each Foo's deserialization ctor is called, filling the List<Foo> with the (correctly deserialized) Foos.
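A minimal sketch of the shape involved (simplified; member names and payload are placeholders, not the real code):

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;

[Serializable]
public class Foo : ISerializable
{
    public int Value { get; private set; }

    public Foo(int value) { Value = value; }

    // Deserialization ctor
    protected Foo(SerializationInfo info, StreamingContext context)
    {
        Value = info.GetInt32("Value");
    }

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("Value", Value);
    }
}

[Serializable]
public class Bar : ISerializable
{
    private List<Foo> foos;

    public Bar(List<Foo> foos) { this.foos = foos; }

    // Deserialization ctor: this is where the list comes back
    // with the right count but null elements.
    protected Bar(SerializationInfo info, StreamingContext context)
    {
        foos = (List<Foo>)info.GetValue("foos", typeof(List<Foo>));
    }

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("foos", foos);
    }
}
```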
Why is this happening? I can't replicate it in a test project: whatever I have tried has resulted in the Foo deserialization ctors being called before the Bar ctor. That is actually the behaviour I would like, as I need the list to be filled in order to do some initialization for the deserialized Bar!
Does anyone have an idea as to what could be causing the Foos to be deserialized so late? Thanks!
Upvotes: 10
Views: 1128
Reputation: 62127
This is expected. The deserializer works object by object, following references. So it first sets up the List with the right number of slots, all of which are initially null; it then deserializes object by object, patching each one into the proper reference.
Any checking or initialization logic of yours should only run after deserialization has completed; by definition, the object graph is in a partial/invalid state while the deserializer is running.
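One way to defer such logic until the whole graph has been restored is IDeserializationCallback, which the formatter invokes only after the entire object graph has been deserialized. A rough sketch (assuming a Foo class as in the question):

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;

[Serializable]
public class Bar : ISerializable, IDeserializationCallback
{
    private List<Foo> foos;

    public Bar(List<Foo> foos) { this.foos = foos; }

    protected Bar(SerializationInfo info, StreamingContext context)
    {
        // The list reference is restored here, but its elements may still be null.
        foos = (List<Foo>)info.GetValue("foos", typeof(List<Foo>));
    }

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("foos", foos);
    }

    // Called by the formatter once the entire object graph has been deserialized,
    // so the Foo elements are populated by the time this runs.
    public void OnDeserialization(object sender)
    {
        // Safe place for validation / initialization that needs the full list.
    }
}
```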
The reason things happen so late is probably that your test scenario is much simpler than the real data, so something on the production side makes the serializer "turn the order around".
Upvotes: 6