jullin

Reputation: 633

Inserting a massive number of records into a database

I need to run some algorithms over data a massive number of times and store each result in a database.

The number of algorithm runs is 80,000-90,000, and each cycle takes about 2 seconds (just the algorithm), so it's very time-consuming.

My database is SQL Server 2008. I want to use ADO.NET Entity Framework (is it a good fit for this task, or not?)
Right now the output data (that needs to be stored in the DB) is plain and raw (not very big), plus some maintenance columns like date and time.

What is the best practice here?
Insert row by row, as each algorithm run completes? Or store the results in memory and insert the data after the work is finished?

Upvotes: 1

Views: 185

Answers (3)

Setämies

Reputation: 1

In case you don't use SqlBulkCopy, you could do the following (a C# sketch follows the list):

  1. Store the data in local variables.
  2. After all the data is collected, begin a SQL transaction and insert each row into the DB. After all the INSERT queries are executed, commit.
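
A minimal sketch of that transactional approach, assuming a hypothetical Results table with Value and RunDate columns:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    class BatchInsertSketch
    {
        // Collect results in memory, then insert them all inside one
        // transaction, so there is a single commit instead of one per row.
        // Table and column names (Results, Value, RunDate) are placeholders.
        static void InsertAll(string connectionString, List<double> results)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var transaction = connection.BeginTransaction())
                using (var command = new SqlCommand(
                    "INSERT INTO Results (Value, RunDate) VALUES (@value, @runDate)",
                    connection, transaction))
                {
                    // Prepare the parameters once and reuse them per row.
                    var valueParam = command.Parameters.Add("@value", SqlDbType.Float);
                    var dateParam = command.Parameters.Add("@runDate", SqlDbType.DateTime);

                    foreach (var result in results)
                    {
                        valueParam.Value = result;
                        dateParam.Value = DateTime.Now;
                        command.ExecuteNonQuery();
                    }

                    transaction.Commit();
                }
            }
        }
    }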

Upvotes: 0

cdel

Reputation: 706

You could use the SqlBulkCopy class with a DataTable as the source data. It's really fast compared with multiple INSERTs.
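
A minimal sketch, again assuming a hypothetical Results table with Value and RunDate columns:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    class BulkCopySketch
    {
        // Stage the results in a DataTable, then stream them to the server
        // with SqlBulkCopy. Table and column names are placeholders.
        static void BulkInsert(string connectionString, double[] results)
        {
            var table = new DataTable();
            table.Columns.Add("Value", typeof(double));
            table.Columns.Add("RunDate", typeof(DateTime));

            foreach (var result in results)
                table.Rows.Add(result, DateTime.Now);

            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "Results";
                bulkCopy.BatchSize = 5000; // send rows in batches of 5,000
                bulkCopy.WriteToServer(table);
            }
        }
    }

For 80,000-90,000 small rows this should finish in seconds, since the data goes over the wire as a bulk stream rather than one round trip per row.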

Upvotes: 1

jaffa

Reputation: 27350

Could you not try BULK INSERT after running your algorithm against all records first? It is very efficient at getting the data into the database.

http://msdn.microsoft.com/en-us/library/ms188365.aspx
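
A sketch of that route from C#: write the collected results to a flat file, then load it with a single BULK INSERT statement. The file path, table name, and terminators below are placeholders, and the file must be readable by the SQL Server service account:

    using System;
    using System.Data.SqlClient;
    using System.IO;
    using System.Linq;

    class BulkInsertSketch
    {
        // Dump all results to a CSV, then load the whole file with one
        // BULK INSERT. Path and table name (Results) are placeholders.
        static void LoadFromFile(string connectionString, double[] results)
        {
            var path = @"C:\temp\results.csv";
            File.WriteAllLines(path, results.Select(r =>
                r + "," + DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss")).ToArray());

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var command = new SqlCommand(
                    @"BULK INSERT Results FROM 'C:\temp\results.csv' " +
                    @"WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')",
                    connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }
    }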

Upvotes: 3
