Reputation: 633
I need to run some algorithms over data a massive number of times and store each result in a database.
The number of algorithm runs is 80,000-90,000, and each cycle takes about 2 seconds (just the algorithm). So it's very time consuming.
My database is SQL Server 2008.
I want to use ADO.NET Entity Framework (is it a good fit for this task or not?).
Right now the output data (that needs to be stored in the DB) is plain and raw (not very big), plus some maintenance columns like date and time.
What is the best practice here?
Insert row by row as each algorithm run completes? Or store the results in memory and insert the data after the work is finished?
Upvotes: 1
Views: 185
Reputation: 1
If you don't want to use SqlBulkCopy, you could do the following:
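For example, a minimal sketch of batching parameterized INSERTs inside a single transaction; the table name, column names, and the in-memory results collection below are placeholders for illustration, not from the question:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Hypothetical shape of one algorithm result held in memory.
class AlgorithmResult
{
    public string Value { get; set; }
    public DateTime CreatedAt { get; set; }
}

class BatchInsertExample
{
    static void SaveResults(string connectionString, List<AlgorithmResult> results)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            // One transaction around all the inserts avoids a commit per row.
            using (var transaction = connection.BeginTransaction())
            {
                var command = new SqlCommand(
                    "INSERT INTO dbo.AlgorithmResults (Result, CreatedAt) " +
                    "VALUES (@result, @createdAt)",
                    connection, transaction);
                command.Parameters.Add("@result", SqlDbType.NVarChar);
                command.Parameters.Add("@createdAt", SqlDbType.DateTime);

                foreach (var result in results)
                {
                    // Reuse the prepared command, swapping parameter values.
                    command.Parameters["@result"].Value = result.Value;
                    command.Parameters["@createdAt"].Value = result.CreatedAt;
                    command.ExecuteNonQuery();
                }
                transaction.Commit();
            }
        }
    }
}

This still issues one round trip per row, so expect it to be noticeably slower than SqlBulkCopy for 80,000-90,000 rows.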
Upvotes: 0
Reputation: 706
You could use the SqlBulkCopy class with a DataTable as the source data. It's really fast compared with multiple INSERTs.
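For example, a sketch of that approach; the table name, column names, and connection string are assumptions, not from the question:

using System;
using System.Data;
using System.Data.SqlClient;

class BulkCopyExample
{
    static void Main()
    {
        // Build the in-memory source: one row per algorithm run.
        var table = new DataTable();
        table.Columns.Add("Result", typeof(string));
        table.Columns.Add("CreatedAt", typeof(DateTime));

        for (int i = 0; i < 90000; i++)
        {
            table.Rows.Add("result " + i, DateTime.Now); // placeholder data
        }

        using (var connection = new SqlConnection("your connection string"))
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.AlgorithmResults";
                bulkCopy.ColumnMappings.Add("Result", "Result");
                bulkCopy.ColumnMappings.Add("CreatedAt", "CreatedAt");
                bulkCopy.BatchSize = 10000; // send in chunks rather than one huge batch
                bulkCopy.WriteToServer(table);
            }
        }
    }
}

If holding 90,000 rows in memory is a concern, you can also call WriteToServer periodically on smaller DataTables as the runs complete.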
Upvotes: 1
Reputation: 27350
Could you not try BULK INSERT after running your algorithm against all records first? It is very efficient at getting data into the database.
http://msdn.microsoft.com/en-us/library/ms188365.aspx
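For example, if the results are first dumped to a flat file, you could run the statement from .NET; the file path, table name, and CSV format below are assumptions for illustration:

using System.Data.SqlClient;

class BulkInsertExample
{
    static void Load(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            // BULK INSERT reads the file on the server side, so the path
            // must be visible to the SQL Server service account.
            var command = new SqlCommand(
                @"BULK INSERT dbo.AlgorithmResults
                  FROM 'C:\data\results.csv'
                  WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')",
                connection);
            command.ExecuteNonQuery();
        }
    }
}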
Upvotes: 3