Reputation: 811
As the title states, I have a text file (700 MB) that I am reading using C#. I am parsing its 2.5 million lines, converting each line into a class, serializing the class, and then inserting it into a SQL Server 2012 database.
The table I am inserting into has two columns and looks like:
{Auto_Increment_id: Serialized Byte Array}
My current strategy is to parse about 10,000 lines, insert them into the database, and then repeat. This takes about 3 hours, so I am sure there is a more efficient way.
One thought I had was to write the inserts to a text file and do a bulk copy into the database. Any other thoughts?
Ultimately I want to get this process down to 10 to 20 minutes. Is this possible?
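For reference, the per-batch shape is roughly this (a minimal sketch only; the Record class, the Payload column name, and BinaryFormatter are stand-ins for whatever the actual parser and serializer are):

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    [Serializable]
    class Record
    {
        // Fields parsed from one line of the file go here.
        public string RawLine;
    }

    static class BatchBuilder
    {
        // Turn one chunk of parsed lines into a batch. The identity column
        // is generated by the database, so only the payload is supplied.
        public static DataTable BuildBatch(IEnumerable<Record> records)
        {
            var table = new DataTable();
            table.Columns.Add("Payload", typeof(byte[]));

            var formatter = new BinaryFormatter();
            foreach (var record in records)
            {
                using (var stream = new MemoryStream())
                {
                    formatter.Serialize(stream, record);
                    table.Rows.Add(stream.ToArray());
                }
            }
            return table;
        }
    }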
Upvotes: 1
Views: 965
Reputation: 62093
SqlBulkCopy. Read about it. In the documentation.
Faster still, because SqlBulkCopy is not written very smartly, is to bulk copy into a temp (staging) table and then, at the end, insert from it into the final table. SqlBulkCopy locks the whole destination table; going through a staging table bypasses that and keeps the final table usable during the upload.
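For concreteness, a minimal sketch of that staging approach (the names StagingTable, FinalTable, and Payload are placeholders, not from the answer):

    using System.Data;
    using System.Data.SqlClient;

    static class BulkLoader
    {
        // Bulk copy one batch into a staging table. SqlBulkCopy takes a bulk
        // update lock on its destination, so staging keeps the final table
        // usable by other sessions while the upload runs.
        public static void BulkCopyToStaging(string connectionString, DataTable batch)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "StagingTable";
                    bulkCopy.BatchSize = 50000;   // far more than 10,000 per go
                    bulkCopy.BulkCopyTimeout = 0; // no timeout on a long load
                    bulkCopy.WriteToServer(batch);
                }
            }
        }

        // Once everything is staged, move it into the final table with one
        // set-based insert, then clear the staging table.
        public static void FlushStagingToFinal(string connectionString)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var command = new SqlCommand(
                    "INSERT INTO FinalTable (Payload) SELECT Payload FROM StagingTable; " +
                    "TRUNCATE TABLE StagingTable;", connection))
                {
                    command.CommandTimeout = 0;
                    command.ExecuteNonQuery();
                }
            }
        }
    }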
Then use multiple threads, each inserting blocks of far more than 10,000 rows per go, as in the sketch below.
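A hedged sketch of the threaded variant, reusing the helpers above; the batch enumerable would come from whatever parses the file into chunks:

    using System.Collections.Generic;
    using System.Data;
    using System.Threading.Tasks;

    static class ParallelLoader
    {
        // Load batches concurrently, each on its own connection, then do the
        // single insert into the final table at the end.
        public static void LoadAll(string connectionString, IEnumerable<DataTable> batches)
        {
            var options = new ParallelOptions { MaxDegreeOfParallelism = 4 }; // tune for your server
            Parallel.ForEach(batches, options,
                batch => BulkLoader.BulkCopyToStaging(connectionString, batch));

            BulkLoader.FlushStagingToFinal(connectionString);
        }
    }

With batches of 50,000 or more and a handful of parallel streams, 2.5 million rows should land in minutes rather than hours.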
I manage more than 100,000 rows per second on a lower-end database server (48 GB of memory, about a dozen SAS discs; and yes, that is lower end).
Upvotes: 4