SuppaiKamo

Reputation: 285

How to prevent a bulk insert from blocking the entire database (other tables than the affected table) in SQL Server?

I am using bulk inserts to insert a collection of large files (up to 100 million records) into a database replacing existing data.

Initially this took about 5 minutes, which is a bit too slow since it locks production tables during update (for the purposes 2-3 minute wait should be max). In order to speed up things I created a staging table from which I copy data internally on the SQL Server (which is about 5 times faster when testing). The thought was that at least the production tables were locked for a shorter time. However now it seems that the entire database is locked during the update process, even tables not in use in the update process.

Is this normal behavior, and how can it be prevented? I have googled a lot, but everything I find relates to locking of the tables being updated, not of unrelated tables.

For a single table, this is the bulk insert:

TRUNCATE TABLE SomeSchema.TradesStaging;

BULK INSERT SomeSchema.TradesStaging
FROM '\\SomePath\SomeSchema.Trades.tab'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n',
    TABLOCK
);

And the internal copying looks like this:

IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES
           WHERE TABLE_SCHEMA = 'SomeSchema' AND TABLE_NAME = 'Trades')
BEGIN
    DROP TABLE SomeSchema.Trades;
END

SELECT * INTO SomeSchema.Trades FROM SomeSchema.TradesStaging;

--a lot of code recreating indexes etc. down here.

All of the code is wrapped in stored procedures, like this (example with only the above):

SomeSchema.StageTradesFromFile
SomeSchema.LoadTradesFromStaging

Upvotes: 0

Views: 1293

Answers (1)

DB101

Reputation: 633

Are you sure you need to drop the SomeSchema.Trades table? Dropping it takes a schema modification lock. Why not just truncate the table and reload it? In addition, you could disable and re-enable the indexes to help speed it up.
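A minimal sketch of that approach, assuming the production table keeps its schema and has a nonclustered index (the name `IX_Trades_Example` is illustrative; substitute the table's actual indexes):

```sql
-- 1. Truncate instead of dropping: the table definition (and its
--    permissions, indexes, statistics) stays in place.
TRUNCATE TABLE SomeSchema.Trades;

-- 2. Disable nonclustered indexes so the load does not maintain them
--    row by row. (Do NOT disable the clustered index: that makes the
--    table inaccessible until it is rebuilt.)
ALTER INDEX IX_Trades_Example ON SomeSchema.Trades DISABLE;

-- 3. Reload from staging; TABLOCK can enable minimal logging here.
INSERT INTO SomeSchema.Trades WITH (TABLOCK)
SELECT * FROM SomeSchema.TradesStaging;

-- 4. Rebuild the disabled indexes in one pass.
ALTER INDEX IX_Trades_Example ON SomeSchema.Trades REBUILD;
```

This keeps the work on the one table being reloaded instead of DDL that touches the schema, which is what the question's `DROP TABLE` / `SELECT ... INTO` sequence does.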

Upvotes: 1

Related Questions