Reputation: 19423
I'm doing a massive import, and calling .SubmitChanges() only once every 1,000 records.
Example:
var targetRecord = new Data.User() { FirstName = sourceRecord.FirstName };
db.Users.InsertOnSubmit(targetRecord);
The above is in a loop, for each record from the source database. Then, later...
if (i % 1000 == 0) { db.SubmitChanges(); }
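Putting the pieces together, the whole thing looks roughly like this (simplified; sourceRecords stands in for my actual source query):

int i = 0;
foreach (var sourceRecord in sourceRecords)
{
    var targetRecord = new Data.User() { FirstName = sourceRecord.FirstName };
    db.Users.InsertOnSubmit(targetRecord);

    i++;
    if (i % 1000 == 0) { db.SubmitChanges(); }
}
db.SubmitChanges(); // flush the final partial batch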
The problem is that the collection of items pending insertion keeps getting bigger and bigger, when what I want is to clear it out after each SubmitChanges().
What I'm looking for:
if (i % 1000 == 0) { db.SubmitChanges(); db.Dispose_InsertOnSubmit_Records(); }
Something like that. Alternatively, I could keep the records in a local list that I re-instantiate after each submit, but that's more code.
Hopefully this makes sense. Thanks!
Upvotes: 2
Views: 645
Reputation: 8920
Well, massive and LINQ to SQL do not go together, I'm afraid. It is just not made for batch processing.
If what you are doing is just a straight import (and your example indicates that), you are much better off using SqlBulkCopy. That is orders of magnitude faster. It's more code, but if you are looking for speed there is no better solution.
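A minimal sketch, assuming a Users table with a FirstName column and a DataTable as the staging buffer (connectionString and sourceRecords are placeholders for your own):

using System.Data;
using System.Data.SqlClient;

var table = new DataTable();
table.Columns.Add("FirstName", typeof(string));

foreach (var sourceRecord in sourceRecords) // your source query
    table.Rows.Add(sourceRecord.FirstName);

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "Users";
    bulk.BatchSize = 1000; // commit to the server in chunks
    bulk.ColumnMappings.Add("FirstName", "FirstName");
    bulk.WriteToServer(table); // one streamed bulk insert instead of row-by-row inserts
}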
Upvotes: 0
Reputation: 16441
You can initialize a new DataContext after each SubmitChanges. I'm not sure of the performance implications, but I've done something similar in the past without any problems.
The only other solution I've seen is iterating through your changes and reverting them. It seems like the former would be a much more efficient method.
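A sketch of the first approach, assuming MyDataContext is your generated context type:

var db = new MyDataContext();
int i = 0;
foreach (var sourceRecord in sourceRecords) // your source query
{
    db.Users.InsertOnSubmit(new Data.User() { FirstName = sourceRecord.FirstName });

    if (++i % 1000 == 0)
    {
        db.SubmitChanges();
        db.Dispose();             // drop the old context and its tracked objects
        db = new MyDataContext(); // fresh context, empty pending-insert set
    }
}
db.SubmitChanges(); // flush the final partial batch
db.Dispose();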
Upvotes: 1