Debashis

Reputation: 91

How do I optimize the import of 100,000 records into SQL Server with hundreds of concurrent users?

I have an ASP.NET MVC web application written in C# in Visual Studio. I need to import an Excel file of more than 100,000 records into SQL Server 2008 R2.

I use SqlBulkCopy with an ADO.NET DataTable to copy the data into a staging table (Table1).

Then I validate and process the data, marking each record as valid or invalid.

Then I insert and update records in 3 tables (Table2, Table3 and Table4) from Table1 using a stored procedure.

Table4 has 2 triggers, and they use a cursor to update several related records in Table4 itself and in another table (Table2).
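For context, the triggers follow roughly this pattern (a simplified sketch; the real table and column names differ):

    -- Simplified sketch of one of the Table4 triggers; the real
    -- column names differ. It walks the inserted rows with a cursor
    -- and updates related rows one at a time.
    CREATE TRIGGER trg_Table4_AfterInsert ON Table4
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @Id INT;
        DECLARE cur CURSOR FOR SELECT Id FROM inserted;

        OPEN cur;
        FETCH NEXT FROM cur INTO @Id;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            -- one statement per row against Table4 ...
            UPDATE Table4 SET Status = 'Linked' WHERE ParentId = @Id;
            -- ... and another against Table2
            UPDATE Table2 SET LastImportedId = @Id WHERE Table4Id = @Id;
            FETCH NEXT FROM cur INTO @Id;
        END
        CLOSE cur;
        DEALLOCATE cur;
    END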

In testing, it takes 1 minute for 10,000 records on my local machine in debug mode.

If 10 or 100 users import at the same time, will it cope?

Is there a better process that would improve performance and support many concurrent users?

My computer configuration: (screenshot omitted)

When I test with 5 concurrent users, 2 or 3 of the imports execute successfully and then I get this error:

Transaction (Process ID 66) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

Upvotes: 3

Views: 947

Answers (4)

user8432352

Reputation:

Add the NOLOCK hint to your queries so your reads do not block other operations.
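For example, on lookup queries that run during the import (note that NOLOCK allows dirty reads, so only use it where reading uncommitted data is acceptable; the table and column names here are illustrative):

    -- Sketch: read lookup data without taking shared locks.
    SELECT Id, Code
    FROM Table2 WITH (NOLOCK)
    WHERE Code = @Code;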

Upvotes: 0

Amit Kumar Singh

Reputation: 4475

  1. Can you disable the indexes during the import (or maybe drop them and re-create them later)? Indexes speed up SELECTs but slow down INSERTs and UPDATEs on large tables. Also use MERGE instead of separate INSERT and UPDATE statements (see the sketches after this list).

  2. Find an alternative to the cursor (such as a CTE, a WHILE loop, or a MERGE statement), or at least use a FAST_FORWARD cursor if you are not already (a set-based version is sketched after this list).

  3. How are you validating the data? Is that step fast enough?

  4. Do you need the triggers during bulk processing? Are these tables used from anywhere else? Instead of running the trigger code once per row, can you run it once as a batch inside the bulk import process? There is an option to DISABLE TRIGGER in SQL Server (sketched after this list); check whether that suits your purpose.

  5. SSIS packages can also import data and do this kind of processing.

  6. You can create Table2_temp, Table3_temp and Table4_temp (perhaps per user, created dynamically and dropped later), do your processing separately, and MERGE into the main Table2, Table3 and Table4 (see the MERGE sketch after this list), provided the triggers on Table4 do not need to act on pre-existing data.
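A sketch of points 1 and 6, with illustrative column names (Code, Amount): one MERGE replaces the separate INSERT and UPDATE passes from the staging table into the main table.

    -- Upsert from the per-user staging table into the main table
    -- in a single set-based statement. Column names are examples.
    MERGE Table2 AS target
    USING Table2_temp AS src
        ON target.Code = src.Code
    WHEN MATCHED THEN
        UPDATE SET target.Amount = src.Amount
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Code, Amount)
        VALUES (src.Code, src.Amount);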
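For point 2, the row-by-row cursor inside the trigger can usually become one set-based UPDATE over all the inserted rows (again with assumed names):

    -- Inside the trigger: replace the per-row cursor with a single
    -- UPDATE joined to the inserted pseudo-table.
    UPDATE t4
    SET t4.Status = 'Linked'
    FROM Table4 AS t4
    INNER JOIN inserted AS i
        ON t4.ParentId = i.Id;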
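For points 1 and 4, a sketch of switching off the triggers and a non-clustered index around the bulk step; the index name is an assumption, and the per-row trigger work then has to be done once, in bulk, before re-enabling:

    -- Switch off triggers and a secondary index for the bulk step.
    DISABLE TRIGGER ALL ON Table4;
    ALTER INDEX IX_Table4_ParentId ON Table4 DISABLE;

    -- ... bulk insert / MERGE here, plus the batched trigger work ...

    ALTER INDEX IX_Table4_ParentId ON Table4 REBUILD;
    ENABLE TRIGGER ALL ON Table4;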

Upvotes: 1

Janusz Nowak

Reputation: 2848

Insert or import these records into new or temporary tables first, and copy/move them to the destination tables later.
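For example (a sketch; #Staging is a session-local temporary table and the column names are illustrative):

    -- Bulk-load into a temp table first, then move the rows to the
    -- destination in one set-based statement.
    CREATE TABLE #Staging (Code VARCHAR(50), Amount DECIMAL(18, 2));

    -- (point SqlBulkCopy at #Staging on the same connection)

    INSERT INTO Table2 (Code, Amount)
    SELECT Code, Amount
    FROM #Staging;

    DROP TABLE #Staging;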

Upvotes: 0

Moien Tajik

Reputation: 2321

Take a look at and try ZZZ Projects; it can bulk insert a million records in a few seconds, and it supports Entity Framework and Dapper: ZZZProject

Upvotes: 1
