Reputation: 16851
I have a list of around 1,000,000 records, and I am using EF to insert them into the database. Since inserting everything at once takes too long, I need to split this list into chunks of 100 records and save each chunk.
How can I do this?
await dbCont.People.AddRangeAsync(peoples);
await dbCont.SaveChangesAsync();
The `peoples` list here contains 1,000,000 records. Instead of saving them all at once (which slows the insert), I need to insert around 100 records at a time. How can I do this?
Upvotes: 3
Views: 5498
Reputation: 16
Below is code to split the list into chunks of 500; you can use any chunk size, such as 100 or 1000, instead of 500. Just replace the number and get going. Happy coding!
C# Code:
public string OrdersListSplitAndProcess(List<OrderInfo> origOrdersList)
{
    const int chunkSize = 500;
    var chunk = new List<OrderInfo>(chunkSize);

    for (int i = 0; i < origOrdersList.Count; i++)
    {
        chunk.Add(origOrdersList[i]);

        // Process when the chunk is full, or when the last (partial) chunk is reached.
        if (chunk.Count == chunkSize || i == origOrdersList.Count - 1)
        {
            processOpenOrders(chunk); // CHUNKED LIST
            chunk.Clear();            // RESET AND PREPARE FOR NEXT CHUNK
        }
    }

    return "finished successfully";
}
Upvotes: 0
Reputation: 762
I have tried inserting in chunks with EF in the past, but it is slow because of entity tracking. You can disable change tracking, of course, but it is better to use SqlBulkCopy for a bulk insert. Here is a good article: Bulk inserting data into SQL Server.
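For reference, a minimal SqlBulkCopy sketch. The connection string, the destination table `dbo.People`, and the column names are assumptions (they are not in the question), so adjust them to your schema:
```csharp
using System.Data;
using Microsoft.Data.SqlClient;

// Build an in-memory DataTable mirroring the destination table's columns.
// Column names here are hypothetical; match them to your actual schema.
var table = new DataTable();
table.Columns.Add("FirstName", typeof(string));
table.Columns.Add("LastName", typeof(string));

foreach (var person in peoples)
{
    table.Rows.Add(person.FirstName, person.LastName);
}

// Stream the rows to SQL Server in one bulk operation, bypassing EF entirely.
using var bulkCopy = new SqlBulkCopy(connectionString)
{
    DestinationTableName = "dbo.People", // assumed table name
    BatchSize = 10000                    // rows per round trip to the server
};
bulkCopy.WriteToServer(table);
```
This avoids the per-entity tracking overhead entirely, which is why it scales to millions of rows where EF inserts struggle.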
Upvotes: 0
Reputation: 43553
You could use the `Batch` extension method from the MoreLinq library: it "batches the source sequence into sized buckets."
var list = new List<int>(Enumerable.Range(1, 25));
var buckets = list.Batch(size: 10);
foreach (var bucket in buckets)
{
Console.WriteLine(String.Join(", ", bucket));
}
Output
1, 2, 3, 4, 5, 6, 7, 8, 9, 10
11, 12, 13, 14, 15, 16, 17, 18, 19, 20
21, 22, 23, 24, 25
Upvotes: 1
Reputation: 66
You can do it like this:
var batchSize = 100;
var processed = 0;
var hasNextBatch = true;
while (hasNextBatch)
{
    var batch = peoples.Skip(processed).Take(batchSize).ToList();
    await dbCont.People.AddRangeAsync(batch);
    await dbCont.SaveChangesAsync();
    processed += batch.Count;
    hasNextBatch = batch.Count == batchSize;
}
But keep in mind that if a batch insert fails, the previous batches have already been committed.
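If you need all-or-nothing behavior, one option is to wrap the whole loop in a single transaction so a failure rolls everything back. This is a sketch assuming EF Core, where `Database.BeginTransactionAsync` is available on the context; note that one transaction spanning 1,000,000 rows can put significant load on the server:
```csharp
// Wrap all batches in one transaction: if any batch fails,
// disposing the transaction without committing rolls back every batch.
await using var transaction = await dbCont.Database.BeginTransactionAsync();

var batchSize = 100;
for (var processed = 0; processed < peoples.Count; processed += batchSize)
{
    var batch = peoples.Skip(processed).Take(batchSize).ToList();
    await dbCont.People.AddRangeAsync(batch);
    await dbCont.SaveChangesAsync();
}

await transaction.CommitAsync();
```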
Upvotes: 1