Reputation: 23
I have an Azure table with over a million entries, and I am trying to run about 300,000 queries programmatically in C#
in order to transfer some data to another system. Currently I am doing the following as I read through a file that contains the partition and row keys:
while (!reader.EndOfStream)
{
    // parse the reader to get partition and row keys

    string currentQuery = TableQuery.CombineFilters(
        TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partKey),
        TableOperators.And,
        TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.Equal, rowKey));

    TableQuery<MyEntity> query = new TableQuery<MyEntity>().Where(currentQuery);

    foreach (MyEntity entity in table.ExecuteQuery(query))
    {
        Console.WriteLine(entity.PartitionKey + ", " + entity.RowKey + ", " + entity.Timestamp.DateTime);
    }

    Thread.Sleep(25);
}
This is taking a very long time to complete (5+ hours). From what I can see, the individual queries take around 200 milliseconds on average. I am fairly new to Azure, so I figure I am doing something wrong. How can I improve it?
Upvotes: 0
Views: 859
Reputation: 71031
A few things:
- Hopefully you are not actually calling Console.WriteLine() for every entity in your real app. If so, this will slow you down as well.
- Set ServicePointManager.UseNagleAlgorithm = false;. Otherwise, individual low-level calls to storage might be buffered up to 500 ms, to more densely pack the TCP packets. This will be important if you're spending cycles processing the content you read.
Upvotes: 2
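For context, here is a minimal sketch of applying that Nagle setting before the query loop, assuming the classic WindowsAzure.Storage SDK that the question's TableQuery code implies; the connection string and table name below are placeholders, not values from the question:

using System.Net;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

class NagleSetup
{
    static void Main()
    {
        // Turn Nagle off *before* creating the client and sending any requests,
        // so small point-query requests are sent immediately instead of being
        // held back to coalesce into larger TCP packets.
        ServicePointManager.UseNagleAlgorithm = false;

        // Hypothetical connection string and table name, for illustration only.
        string connectionString = "UseDevelopmentStorage=true";
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);

        // Optionally scope the setting to just the table endpoint instead of globally:
        // ServicePointManager.FindServicePoint(account.TableEndpoint).UseNagleAlgorithm = false;

        CloudTableClient tableClient = account.CreateCloudTableClient();
        CloudTable table = tableClient.GetTableReference("MyTable");

        // ... then run the existing while / ExecuteQuery loop against this table reference.
    }
}

The key detail is that the setting only affects connections opened after it is changed, so it belongs at startup, before the first request to the table.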