Reputation: 6037
I have a weirdly-made JSON that I need to deserialize.
It looks like this:
{
  "Properties": [
    { "A": "aaa", "B": "bbb", "C": "ccc" },
    { "X": "xxx", "Y": "yyy" },
    { "many other": "items" }
  ],
  "Products": [
    { "Id": 0, "PropertiesIndexes": [ 0, 1 ] },
    { "Id": 1, "PropertiesIndexes": [ 0, 1, 2 ] }
  ]
}
The properties array can have objects with any number and name of keys, and each product accesses the properties array through an index.
We need to store these JSON files in MongoDB, and because they can be quite huge (several hundred MB per file) I need to split them (MongoDB has a 16 MB document limit). So each property is a Mongo document, and each product is a Mongo document.
Because I can't rely on getting the properties back in insertion order, we have decided to save all the properties of each product.
Here are the classes used to deserialize this data (using JSON.Net):
public class Entity {
    public OrderedDictionary Properties { get; set; }
    public IEnumerable<Product> Products { get; set; }
}

public class Product {
    [JsonIgnore]
    public Entity Entity { get; set; }

    public int Id { get; set; }
    public int[] PropertiesIndexes { get; set; }
}
Now, I need to access the actual properties data directly from the product, so that a (hypothetical) resulting JSON would be like this:
{
  "Products": [
    {
      "Id": 0,
      "PropertiesData": [
        { "A": "aaa", "B": "bbb", "C": "ccc" },
        { "X": "xxx", "Y": "yyy" }
      ]
    },
    {
      "Id": 1,
      "PropertiesData": [
        { "A": "aaa", "B": "bbb", "C": "ccc" },
        { "X": "xxx", "Y": "yyy" },
        { "many other": "items" }
      ]
    }
  ]
}
My naïve implementation is this:
// in Product
[JsonIgnore]
public IDictionary<string, object> PropertiesData {
    get {
        if (this.Entity != null && this.Entity.Properties != null) {
            var data = new Dictionary<string, object>();
            for (int i = 0; i < this.PropertiesIndexes.Length; i++) {
                data.Add(
                    this.Entity.Properties.Cast<DictionaryEntry>().ElementAt(this.PropertiesIndexes[i]).Key.ToString(),
                    this.Entity.Properties[this.PropertiesIndexes[i]]);
            }
            return data;
        }
        return new Dictionary<string, object>();
    }
}
but it's slow (as I said, I have huge amounts of data) and very ugly.
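Most of the slowness comes from Cast<DictionaryEntry>().ElementAt(i), which re-enumerates the whole OrderedDictionary for every index. One possible fix is to snapshot the keys once (O(n)) so each subsequent lookup is O(1); a sketch, where the helper name PropertiesLookup.Resolve is mine:

```csharp
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Linq;

public static class PropertiesLookup
{
    // Resolves the product's index list against the entity's OrderedDictionary.
    // The keys are materialized once, instead of re-enumerating the dictionary
    // with Cast().ElementAt() on every access.
    public static IEnumerable<KeyValuePair<string, object>> Resolve(
        OrderedDictionary properties, int[] indexes)
    {
        // One pass to snapshot the keys in insertion order.
        var keys = properties.Keys.Cast<object>()
                                  .Select(k => k.ToString())
                                  .ToArray();

        foreach (var i in indexes)
            yield return new KeyValuePair<string, object>(keys[i], properties[i]);
    }
}
```

Product.PropertiesData could then simply return PropertiesLookup.Resolve(this.Entity.Properties, this.PropertiesIndexes), keeping the lazy, yielding shape of the old ExpandoObject version.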
Back when we were using ExpandoObjects, I had a nice, fast and memory-efficient yielding method:
private IEnumerable<ExpandoObject> getPropertiesDataFromEntity() {
    for (int i = 0; i < this.PropertiesIndexes.Length; i++) {
        yield return this.Entity.Properties[this.PropertiesIndexes[i]];
    }
}
but ExpandoObjects have problems of their own in the context of our app: they are not always saved properly in Mongo, and WCF won't serialize them without a custom serializer.
Upvotes: 0
Views: 107
Reputation: 82474
I've written something quite simple to get from your source JSON to your target JSON.
It consists of four simple classes.
The first two are designed to be deserialized from your source JSON:
public class Source
{
    public Dictionary<string, string>[] Properties { get; set; }
    public SourceProduct[] Products { get; set; }
}

public class SourceProduct
{
    public int Id { get; set; }
    public int[] PropertiesIndexes { get; set; }
}
The third class is the target product:
public class Product
{
    private List<Dictionary<string, string>> _propertiesData = new List<Dictionary<string, string>>();

    public int Id { get; set; }
    public List<Dictionary<string, string>> PropertiesData { get { return _propertiesData; } }
}
And the last class is the final target, where I am translating the data from the source to target:
public class Target
{
    private List<Product> _products = new List<Product>();

    public IEnumerable<Product> Products { get { return _products; } }

    public Target(Source source)
    {
        foreach (var sourceProduct in source.Products)
        {
            var product = new Product()
            {
                Id = sourceProduct.Id
            };

            foreach (var propertyIndex in sourceProduct.PropertiesIndexes)
            {
                product.PropertiesData.Add(source.Properties[propertyIndex]);
            }

            _products.Add(product);
        }
    }
}
Using these classes, your client code simply becomes this:
var targetJson = JsonConvert.SerializeObject(
    new Target(JsonConvert.DeserializeObject<Source>(sourceJson)));
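To make the round trip concrete, here is a small demonstration using the Source, SourceProduct, Product and Target classes defined above, fed with a trimmed-down version of the question's sample JSON (the sourceJson literal is mine):

```csharp
using Newtonsoft.Json;

var sourceJson = @"{
  ""Properties"": [
    { ""A"": ""aaa"", ""B"": ""bbb"", ""C"": ""ccc"" },
    { ""X"": ""xxx"", ""Y"": ""yyy"" }
  ],
  ""Products"": [
    { ""Id"": 0, ""PropertiesIndexes"": [ 0, 1 ] }
  ]
}";

// Deserialize into the source shape, then translate to the target shape.
var target = new Target(JsonConvert.DeserializeObject<Source>(sourceJson));
var targetJson = JsonConvert.SerializeObject(target);
// targetJson now nests the referenced property objects under "PropertiesData"
// instead of referring to them by index.
```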
Upvotes: 1