I have a class that is deserialized using JsonConvert.DeserializeObject. In that class, I already had the following properties:
public string Name { get; private set; }
public uint TotalFrames { get; private set; }
public string AttachedDetachBoneId { get; private set; }
public string StartingBone { get; set; }
public uint SyncFrame { get; private set; }
public uint CycleFrames { get; private set; }
public uint AttachDetachFrame { get; private set; }
These all get successfully deserialized from string data read from a file. I then added the following:
public bool ForceAttachedObjectToGround { get; private set; } = true;
This does not get deserialized. I tested a few different cases, and based on the results I thought perhaps the property type was somehow the issue, so I added three new test properties of different types:
public bool BoolTest { get; private set; }
public uint UintTest { get; private set; }
public string StrTest { get; private set; }
These also did not get deserialized unless I made the setters public or added the [JsonProperty] attribute. And if I then remove the attribute or make the setters private again, deserialization once again stops working.
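Roughly what those two workarounds looked like on the test properties (a sketch, not the full class):

// Workaround 1: make the setter public
public bool BoolTest { get; set; }

// Workaround 2: keep the private setter but opt in explicitly
[JsonProperty]
public uint UintTest { get; private set; }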
What is happening here? How is Json.Net deciding what to deserialize and what not to?
For completeness' sake: I do specify my own JsonSerializerSettings, with TypeNameHandling = TypeNameHandling.Objects and a custom binder. The binder is not used in this case, though.
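The settings look roughly like this (a sketch; jsonText stands in for the string read from the file, and the binder is my own implementation, omitted here):

using Newtonsoft.Json;

var settings = new JsonSerializerSettings
{
    TypeNameHandling = TypeNameHandling.Objects,
    // SerializationBinder = ...my custom binder (not involved in this scenario)
};
var animation = JsonConvert.DeserializeObject<AnimationData>(jsonText, settings);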
In this case, the (incredibly obvious) answer was that the serialization behaviour was inherited from an interface the class was implementing:
public interface IAnimationData
{
    [JsonProperty]
    string Name { get; }

    [JsonProperty]
    uint TotalFrames { get; }

    // etc..
}

public class AnimationData : IAnimationData
{
    public string Name { get; private set; }
    public uint TotalFrames { get; private set; }
    // etc
}