Reputation: 9420
I can't decide on the best approach for handling the following scenario with Azure Storage.
I would like to use a Block Blob, but downloading a single ~400MB file to the machine just to add one line and upload it back doesn't make sense, and I couldn't find a way around that.
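Just to illustrate what I mean, the Block Blob round trip would look roughly like this (a sketch using the Python azure-storage-blob package with made-up container/blob names; the same flow applies whichever client library is used):

```python
# Sketch of the "download, append one line, re-upload" Block Blob flow.
# Connection string, container, and blob names are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="logs", blob="daily-data.csv")

# Pull the whole ~400MB blob down just to add one line...
data = blob.download_blob().readall()

# ...append a single record...
data += b"2012-07-30T10:00:00,source-1,42.0\n"

# ...and push the whole thing back up.
blob.upload_blob(data, overwrite=True)
```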
There is the Drive option, which uses a Page Blob, but unfortunately it is not supported by SDK v2, which makes me nervous about possible discontinuation of support.
And the final option is Table storage, which looks OK except that continuously reading a few hundred thousand rows may become an issue.
Basically, I would prefer to write to the files immediately as I retrieve the data. But if it's worth the trade-off, I can live with a single update at the end of the day, which means ~300-1000 lines per file.
What would be the best approach to handle this scenario?
Upvotes: 2
Views: 822
Reputation: 20556
Based on the requirements above, Azure Tables are the best option. With a single Azure Storage account you get the following:
Storage Transactions – Up to 20,000 entities/messages/blobs per second
Single Table Partition – a table partition is all of the entities in a table with the same partition key value, and most tables have many partitions. The throughput target for a single partition is:
Tables – use a more finely grained PartitionKey for the table so that the service can automatically spread the table partitions across more servers.
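As a sketch (assuming the Python azure-data-tables package; the table name and key scheme below are just an illustration), writing each retrieved record as an entity with a fine-grained PartitionKey could look like this:

```python
# Sketch: write each retrieved record as a table entity.
# PartitionKey = source + date keeps partitions fine-grained so the
# service can spread them across servers; RowKey keeps rows unique and
# ordered within a partition. Names are illustrative only.
from azure.data.tables import TableServiceClient

service = TableServiceClient.from_connection_string("<connection-string>")
table = service.create_table_if_not_exists("dailydata")

entity = {
    "PartitionKey": "source-1_2012-07-30",    # fine-grained: per source, per day
    "RowKey": "2012-07-30T10:00:00.0000000",  # unique within the partition
    "Value": 42.0,
    "Note": "one record written immediately instead of a 400MB blob round trip",
}
table.create_entity(entity)
```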
As for reading a "few hundred thousand rows" continuously, your main obstacle is the storage-account-level limit of 20,000 transactions/sec; however, if you design your partitions granularly enough that they are spread across hundreds of servers, you should be able to read hundreds of thousands of rows in minutes.
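Reading a day's worth of rows back could then be done partition by partition, and the per-partition queries can be fanned out in parallel (again a sketch; the names continue the illustration above):

```python
# Sketch: read everything back by querying one partition at a time.
# Each query stays within a single partition, so the per-partition
# throughput target applies to each query independently.
from azure.data.tables import TableClient

table = TableClient.from_connection_string("<connection-string>", table_name="dailydata")

sources = ["source-1", "source-2"]  # illustrative list of partition prefixes
day = "2012-07-30"

rows = []
for source in sources:
    pk = f"{source}_{day}"
    entities = table.query_entities(query_filter=f"PartitionKey eq '{pk}'")
    rows.extend(entities)

print(f"read {len(rows)} rows")
```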
Source:
Upvotes: 3