Reputation: 33990
I'm looking to store arrays in Azure Table entities. At present, the only array type supported natively is the byte array, limited to 64 KB. The size is enough, but I'd like to store arrays of longs, doubles and timestamps in an entity.
I can obviously convert the values to and from bytes myself, but I was wondering whether there's a best practice for this.
To clarify, these are fixed length arrays (e.g. 1000 cells) associated with a single key.
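For concreteness, this is the kind of manual round-trip I mean (a sketch with helper names of my own invention; Buffer.BlockCopy reinterprets the longs as raw bytes, so a 1000-element long[] packs into 8000 bytes, well under the 64 KB property limit):

    using System;

    static class ArrayPacking
    {
        // 1000 longs -> 8000 bytes, well under the 64 KB byte-array property limit.
        public static byte[] Pack(long[] values)
        {
            var bytes = new byte[values.Length * sizeof(long)];
            Buffer.BlockCopy(values, 0, bytes, 0, bytes.Length);
            return bytes;
        }

        public static long[] Unpack(byte[] bytes)
        {
            var values = new long[bytes.Length / sizeof(long)];
            Buffer.BlockCopy(bytes, 0, values, 0, bytes.Length);
            return values;
        }
    }

Doubles would work the same way with sizeof(double), and timestamps could go through DateTime.ToBinary/FromBinary to reuse the long-based packing.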
Upvotes: 3
Views: 5139
Reputation: 1749
I have written an Azure table storage client, called Lucifure Stash, which supports arrays, enums, large data, serialization, public and private properties and fields, and more.
You can get it at https://github.com/hocho/LucifureStash
Upvotes: 4
Reputation: 2066
You could serialize your array as a JSON string using the .NET JavaScriptSerializer class: http://msdn.microsoft.com/en-us/library/system.web.script.serialization.javascriptserializer.aspx
This class has a MaxJsonLength property you can use to ensure your serialized arrays don't exceed 64 KB, and you can use the same class to deserialize the stored objects.
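As a rough sketch (the names here are illustrative; note that MaxJsonLength counts characters, so 32 K UTF-16 characters corresponds to the 64 KB property limit):

    using System.Web.Script.Serialization; // System.Web.Extensions assembly

    class JsonArrayExample
    {
        static void Main()
        {
            // 32 K UTF-16 chars = 64 KB, matching the table string property limit.
            var serializer = new JavaScriptSerializer { MaxJsonLength = 32 * 1024 };

            double[] values = { 1.5, 2.5, 3.5 };

            // Serialize throws InvalidOperationException if the result
            // exceeds MaxJsonLength.
            string json = serializer.Serialize(values);

            double[] roundTripped = serializer.Deserialize<double[]>(json);
        }
    }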
Upvotes: 0
Reputation: 21
If you choose to store your object in blob storage and need more than one "key" to get at it, you can just create an Azure table (or two, or n) that stores each key you want to look up alongside a reference to the exact blob item.
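A hedged sketch of what such an index row could look like, assuming the Microsoft.WindowsAzure.Storage SDK (the entity, table, and property names are made up):

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Table;

    // One row per lookup key; the payload itself lives in blob storage.
    public class BlobPointerEntity : TableEntity
    {
        public BlobPointerEntity() { }

        public BlobPointerEntity(string partitionKey, string rowKey, string blobUri)
            : base(partitionKey, rowKey)
        {
            BlobUri = blobUri;
        }

        public string BlobUri { get; set; }
    }

    // Usage: insert one pointer row per key that should resolve to the blob.
    // var table = account.CreateCloudTableClient().GetTableReference("arrayindex");
    // table.CreateIfNotExists();
    // table.Execute(TableOperation.Insert(
    //     new BlobPointerEntity("sensor42", "2011-10", "https://.../container/blob")));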
Upvotes: 2
Reputation: 9399
I've been trying to think of a nice way to do this other than the method you've already mentioned, and I'm at a loss. The simplest solution I can come up with is to take the array, binary-serialize it and store it in a binary array property.
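For example, something along these lines (a sketch with made-up helper names, using BinaryFormatter):

    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    static class BinarySerialization
    {
        public static byte[] ToBytes(double[] values)
        {
            var formatter = new BinaryFormatter();
            using (var stream = new MemoryStream())
            {
                formatter.Serialize(stream, values);
                // The result must stay under the 64 KB property limit.
                return stream.ToArray();
            }
        }

        public static double[] FromBytes(byte[] bytes)
        {
            var formatter = new BinaryFormatter();
            using (var stream = new MemoryStream(bytes))
            {
                return (double[])formatter.Deserialize(stream);
            }
        }
    }

BinaryFormatter adds some header overhead compared to copying the raw bytes yourself, so for a fixed-length array of 1000 doubles you still land comfortably under 64 KB.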
There were other options I came up with, but I dismissed them all.
Upvotes: 3
Reputation: 23552
If you have just a key-value collection to store, then you can also check out Azure BLOBs. They can rather efficiently store arrays of up to 25M time-value points per single blob (with random access within the dataset).
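As a hedged illustration of the random access, assuming the Microsoft.WindowsAzure.Storage SDK and a blob laid out as contiguous 16-byte records (a long of ticks plus a double) — the layout and names are my assumptions, not anything prescribed:

    using System;
    using System.IO;
    using Microsoft.WindowsAzure.Storage.Blob;

    static class BlobRandomAccess
    {
        const int RecordSize = sizeof(long) + sizeof(double); // 16 bytes per point

        // Reads the i-th time-value point without downloading the whole blob.
        public static Tuple<DateTime, double> ReadPoint(CloudBlockBlob blob, long index)
        {
            using (var stream = new MemoryStream())
            {
                // Range read: fetch only the 16 bytes for this record.
                blob.DownloadRangeToStream(stream, index * RecordSize, RecordSize);
                byte[] buffer = stream.ToArray();
                long ticks = BitConverter.ToInt64(buffer, 0);
                double value = BitConverter.ToDouble(buffer, sizeof(long));
                return Tuple.Create(new DateTime(ticks), value);
            }
        }
    }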
Upvotes: 3