Reputation: 8009
What are the limits of calling ExecuteQuery()? For example, are there limits on the number of entities returned or on the download size? In other words, when will the method below hit its limits?
private static void ExecuteSimpleQuery(CloudTable table, string partitionKey, string startRowKey, string endRowKey)
{
    try
    {
        // Create the range query using the fluent API
        TableQuery<CustomerEntity> rangeQuery = new TableQuery<CustomerEntity>().Where(
            TableQuery.CombineFilters(
                TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partitionKey),
                TableOperators.And,
                TableQuery.CombineFilters(
                    TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.GreaterThanOrEqual, startRowKey),
                    TableOperators.And,
                    TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.LessThanOrEqual, endRowKey))));

        foreach (CustomerEntity entity in table.ExecuteQuery(rangeQuery))
        {
            Console.WriteLine("Customer: {0},{1}\t{2}\t{3}", entity.PartitionKey, entity.RowKey, entity.Email, entity.PhoneNumber);
        }
    }
    catch (StorageException e)
    {
        Console.WriteLine(e.Message);
        Console.ReadLine();
        throw;
    }
}
The method below uses ExecuteQuerySegmentedAsync with a TakeCount of 50. How should that value of 50 be determined? I suspect the answer depends on my questions above.
private static async Task PartitionRangeQueryAsync(CloudTable table, string partitionKey, string startRowKey, string endRowKey)
{
    try
    {
        // Create the range query using the fluent API
        TableQuery<CustomerEntity> rangeQuery = new TableQuery<CustomerEntity>().Where(
            TableQuery.CombineFilters(
                TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partitionKey),
                TableOperators.And,
                TableQuery.CombineFilters(
                    TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.GreaterThanOrEqual, startRowKey),
                    TableOperators.And,
                    TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.LessThanOrEqual, endRowKey))));

        // Request 50 results at a time from the server.
        TableContinuationToken token = null;
        rangeQuery.TakeCount = 50;
        int segmentNumber = 0;
        do
        {
            // Execute the query, passing in the continuation token.
            // The first time this method is called, the continuation token is null.
            // If there are more results, the call populates the continuation token
            // for use in the next call.
            TableQuerySegment<CustomerEntity> segment = await table.ExecuteQuerySegmentedAsync(rangeQuery, token);

            // Indicate which segment is being displayed
            if (segment.Results.Count > 0)
            {
                segmentNumber++;
                Console.WriteLine();
                Console.WriteLine("Segment {0}", segmentNumber);
            }

            // Save the continuation token for the next call to ExecuteQuerySegmentedAsync
            token = segment.ContinuationToken;

            // Write out the properties for each entity returned.
            foreach (CustomerEntity entity in segment)
            {
                Console.WriteLine("\t Customer: {0},{1}\t{2}\t{3}", entity.PartitionKey, entity.RowKey, entity.Email, entity.PhoneNumber);
            }

            Console.WriteLine();
        }
        while (token != null);
    }
    catch (StorageException e)
    {
        Console.WriteLine(e.Message);
        Console.ReadLine();
        throw;
    }
}
Examples are from the link below: https://github.com/Azure-Samples/storage-table-dotnet-getting-started
Upvotes: 1
Views: 3448
Reputation: 136306
For ExecuteQuerySegmentedAsync, the limit is 1000 entities per segment. This follows from a limitation of the REST API, where a single request to the Table service can return at most 1000 entities (Ref: https://learn.microsoft.com/en-us/rest/api/storageservices/query-timeout-and-pagination).

The ExecuteQuery method will try to return all entities matching the query. Internally, it fetches at most 1000 entities per request and issues a follow-up request whenever the response from the Table service includes a continuation token.
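As a rough sketch of that behavior (this is an approximation, not the library's actual implementation), ExecuteQuery is roughly equivalent to running a segmented loop yourself. The example below uses the synchronous ExecuteQuerySegmented overload from the full .NET Framework Azure Storage SDK (on .NET Core only the async variant is available) and the CustomerEntity type from the samples above:

```csharp
// Hedged sketch: approximates what ExecuteQuery does internally.
// Requires: using System.Collections.Generic; using Microsoft.WindowsAzure.Storage.Table;
private static IEnumerable<CustomerEntity> ExecuteQueryManually(CloudTable table, TableQuery<CustomerEntity> query)
{
    TableContinuationToken token = null;
    do
    {
        // Each request returns at most 1000 entities (the REST API cap).
        TableQuerySegment<CustomerEntity> segment = table.ExecuteQuerySegmented(query, token);

        foreach (CustomerEntity entity in segment)
        {
            yield return entity;
        }

        // A non-null continuation token means more results remain on the server.
        token = segment.ContinuationToken;
    }
    while (token != null);
}
```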
UPDATE
If ExecuteQuery performs pagination automatically, it seems easier to use than ExecuteQuerySegmentedAsync. Why should I use ExecuteQuerySegmentedAsync? And what about download size: is it 1000 entities regardless of their size?
With ExecuteQuery, there is no way for you to break out of the loop, which becomes problematic when a table contains a lot of entities. You have that flexibility with ExecuteQuerySegmentedAsync. For example, suppose you want to download all entities from a very large table and save them locally: with ExecuteQuerySegmentedAsync, you can save each segment to a different file.
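A minimal sketch of that file-per-segment pattern (the "segment-{N}.txt" naming and tab-separated output are illustrative assumptions, not part of the Azure samples):

```csharp
// Hedged sketch: write each query segment to its own local file.
// Requires: using System.IO; using System.Linq; using System.Threading.Tasks;
//           using Microsoft.WindowsAzure.Storage.Table;
private static async Task SaveTableInSegmentsAsync(CloudTable table, TableQuery<CustomerEntity> query)
{
    TableContinuationToken token = null;
    int segmentNumber = 0;
    do
    {
        TableQuerySegment<CustomerEntity> segment = await table.ExecuteQuerySegmentedAsync(query, token);
        token = segment.ContinuationToken;

        if (segment.Results.Count > 0)
        {
            segmentNumber++;
            // Assumed file name scheme: segment-1.txt, segment-2.txt, ...
            var lines = segment.Results.Select(e =>
                string.Join("\t", e.PartitionKey, e.RowKey, e.Email, e.PhoneNumber));
            File.WriteAllLines($"segment-{segmentNumber}.txt", lines);
        }

        // Unlike ExecuteQuery, you could stop early here, e.g. after N segments.
    }
    while (token != null);
}
```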
Regarding your comment about 1000 entities regardless of their size: yes, the cap is 1000 entities per request no matter how large they are. Keep in mind that a single entity can be at most 1 MB.
Upvotes: 3