Reputation: 1525
I am writing a query to insert 300k rows into a table, and I get a 404 "File or Directory not found" error during the insertion. I am new to Azure, so I may have written some naive code.
I create the DynamicTableEntity objects here:
Test t = new Test();
t.CreateTable("table2");
Dictionary<string, EntityProperty> dic;
Random random = new Random();
ArrayList part1 = new ArrayList();
ArrayList part2 = new ArrayList();
ArrayList part3 = new ArrayList();
ArrayList part4 = new ArrayList();
ArrayList part5 = new ArrayList();
ArrayList part6 = new ArrayList();
DynamicTableEntity dte;
Stopwatch sw = new Stopwatch();
Guid guid;
sw.Start();
for (int i = 0; i < 300000; i++)
{
    dic = new Dictionary<string, EntityProperty>();
    dic.Add("count", new EntityProperty(i));
    dic.Add("rand1", new EntityProperty(random.Next()));
    dic.Add("rand2", new EntityProperty(random.Next()));
    dic.Add("rand3", new EntityProperty(random.Next()));
    if (i != 5)
        dic.Add("rand4", new EntityProperty(random.Next()));
    else
        dic.Add("rand4", new EntityProperty(1234561));
    if (i % 2 == 0)
        dic.Add("randInt", new EntityProperty(random.Next()));
    else
        dic.Add("String", new EntityProperty("This is a string" + i));
    guid = Guid.NewGuid();
    dte = new DynamicTableEntity("0" + (i % 3), DateTime.UtcNow.Ticks.ToString() + guid + i);
    dte.Properties = dic;
    if (i % 6 == 0)
        part1.Add(dte);
    else if (i % 6 == 1)
        part2.Add(dte);
    else if (i % 6 == 2)
        part3.Add(dte);
    else if (i % 6 == 3)
        part4.Add(dte);
    else if (i % 6 == 4)
        part5.Add(dte);
    else if (i % 6 == 5)
        part6.Add(dte);
    if ((i + 1) % 600 == 0)
    {
        t.Insert("table2", part1, part2, part3, part4, part5, part6);
        part1.RemoveRange(0, 100);
        part2.RemoveRange(0, 100);
        part3.RemoveRange(0, 100);
        part4.RemoveRange(0, 100);
        part5.RemoveRange(0, 100);
        part6.RemoveRange(0, 100);
    }
}
And here is the insertion execution code:
CloudTable table = _tableClient.GetTableReference(tableName);
TableBatchOperation tablebatchoperation;
int i = 0, j = 0;
if (created == false)
    table.CreateIfNotExists();
foreach (ArrayList k in l)
{
    tablebatchoperation = new TableBatchOperation();
    foreach (DynamicTableEntity dte in k)
    {
        tablebatchoperation.Insert(dte);
    }
    table.ExecuteBatch(tablebatchoperation);
}
I appreciate your help
Upvotes: 0
Views: 1514
Reputation: 24895
A possible issue I see here is that you're executing a batch operation which contains more than 100 items. Try something like this (the code is not optimized, but you'll get the picture):
TableBatchOperation tablebatchoperation = null;
foreach (ArrayList k in l)
{
    foreach (DynamicTableEntity dte in k)
    {
        if (tablebatchoperation == null)
            tablebatchoperation = new TableBatchOperation();
        tablebatchoperation.Insert(dte);
        // Flush once the batch reaches the 100-operation limit.
        if (tablebatchoperation.Count == 100)
        {
            table.ExecuteBatch(tablebatchoperation);
            tablebatchoperation = new TableBatchOperation();
        }
    }
}
// Flush any remaining entities (guard against a null batch when the input is empty).
if (tablebatchoperation != null && tablebatchoperation.Count > 0)
    table.ExecuteBatch(tablebatchoperation);
Also keep in mind that all entities in a single batch operation must belong to the same partition. Where are you setting the PartitionKey and RowKey?
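To make both constraints concrete, here's a minimal sketch that groups entities by PartitionKey and chunks each group into batches of at most 100. The BuildBatches helper name is mine, not part of the SDK; only the TableBatchOperation and DynamicTableEntity types come from the storage client library:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure.Storage.Table;

static IEnumerable<TableBatchOperation> BuildBatches(IEnumerable<DynamicTableEntity> entities)
{
    // A batch may only target one partition, so group first.
    foreach (var group in entities.GroupBy(e => e.PartitionKey))
    {
        var batch = new TableBatchOperation();
        foreach (var entity in group)
        {
            batch.Insert(entity);
            // A batch may contain at most 100 operations.
            if (batch.Count == 100)
            {
                yield return batch;
                batch = new TableBatchOperation();
            }
        }
        if (batch.Count > 0)
            yield return batch;
    }
}
```

You would then execute each batch in turn: foreach (var batch in BuildBatches(allEntities)) table.ExecuteBatch(batch);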
Upvotes: 1
Reputation: 894
If you take a look at the following question you might find what you need. It is also about achieving high insert throughput: How to achieve more than 10 inserts per second with Azure Table Storage
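The main idea in that thread is to issue the batches concurrently rather than one at a time. A hedged sketch using the Task-based async API (this assumes each TableBatchOperation already respects the 100-operation / single-PartitionKey limits, and that your storage client library version exposes ExecuteBatchAsync):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Table;

// Fire all prepared batches concurrently and wait for them to finish.
static async Task InsertAllAsync(CloudTable table, IEnumerable<TableBatchOperation> batches)
{
    var tasks = batches.Select(b => table.ExecuteBatchAsync(b));
    await Task.WhenAll(tasks);
}
```

In practice you may want to cap the degree of parallelism (e.g. with a SemaphoreSlim) to avoid being throttled by the storage service.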
Upvotes: 0