willvv

Reputation: 8649

Take advantage of multiple cores executing SQL statements

I have a small application that reads XML files and inserts their contents into a SQL Server database.

There are ~300,000 files to import, each with ~1,000 records.

I started the application on 20% of the files and it has now been running for 18 hours; I'd like to improve on that time for the remaining files.

I'm not using a multi-threaded approach, but since the machine I'm running the process on has 4 cores, I was thinking of using one to get some performance improvement (although I suspect the main bottleneck is I/O rather than processing).

I was thinking of using the BeginExecuteNonQuery() method on the SqlCommand object I create for each insertion, but I don't know whether I should limit the maximum number of simultaneous commands (nor do I know how to do that).
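Something like the following is what I had in mind: a rough, untested sketch that throttles the number of in-flight BeginExecuteNonQuery() calls with a semaphore. The class, table, and connection-string details are placeholders.

```csharp
using System.Data.SqlClient;
using System.Threading;

class ThrottledInserter
{
    // Limit in-flight commands to roughly the number of cores,
    // so the connection pool isn't exhausted.
    private static readonly Semaphore Throttle = new Semaphore(4, 4);

    public static void Insert(string connectionString, string insertSql)
    {
        Throttle.WaitOne();
        // Asynchronous Processing=true is required by SqlClient
        // for the Begin/End async command pattern.
        var connection = new SqlConnection(
            connectionString + ";Asynchronous Processing=true");
        connection.Open();
        var command = new SqlCommand(insertSql, connection);
        command.BeginExecuteNonQuery(ar =>
        {
            try
            {
                command.EndExecuteNonQuery(ar);
            }
            finally
            {
                connection.Dispose();
                Throttle.Release();
            }
        }, null);
    }
}
```

Is limiting concurrency this way the right idea, or is there a better approach?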

What's your advice to get the best CPU utilization?

Thanks

Upvotes: 1

Views: 303

Answers (3)

Joel Coehoorn

Reputation: 416049

Look into BULK INSERT. From the SQL Server documentation:

> Imports a data file into a database table or view in a user-specified format.
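For example (untested; the table name, file path, and delimiters are placeholders, and you would first need to flatten the XML records into a delimited file):

```sql
-- Load a pre-staged, tab-delimited file into a table in one set-based operation.
BULK INSERT dbo.Records
FROM 'C:\import\records.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n',
    TABLOCK  -- take a bulk-update lock to allow minimally logged inserts
);
```

One bulk load per file (or per batch of files) should be dramatically faster than ~1,000 individual INSERT statements per file.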

Upvotes: 0

innaM

Reputation: 47869

If I understand you correctly, you are reading those files on the same machine that runs the database. Although I don't know much about your machine, I'd bet your bottleneck is disk I/O. This doesn't sound terribly computationally intensive to me.

Upvotes: 2

Harper Shelby

Reputation: 16583

Have you tried using SqlBulkCopy? Basically, you load your data into a DataTable instance, then use the SqlBulkCopy class to load it into SQL Server. It should offer a HUGE performance increase without as much change to your current process as bcp or another external utility would require.
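A minimal sketch (untested; the destination table name, column layout, and connection string are placeholders you'd adapt to your schema):

```csharp
using System.Data;
using System.Data.SqlClient;

static class Importer
{
    // Accumulate the ~1,000 records parsed from one XML file into a
    // DataTable, then push them to the server in a single bulk operation.
    public static void BulkLoad(string connectionString, DataTable table)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.Records";
            bulk.BatchSize = 1000;  // one round trip per ~1,000 rows
            bulk.WriteToServer(table);
        }
    }
}
```

You keep your existing XML-parsing code; only the per-row INSERTs are replaced by one WriteToServer call per file.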

Upvotes: 0
