Reputation: 918
I have to copy over 1 million rows from one table to another. Because we have over 50 databases to update, I use a small PHP program to initiate the transaction per database. But while testing on just one database, I noticed that an "INSERT INTO ... SELECT FROM" statement copies only around 10,000 rows and then suddenly stops.
We are doing an overhaul of a database and created a transaction with several UPDATE and INSERT queries to do the job. All the other queries in the transaction finish, but the bulk insert does not, even though there are still other T-SQL statements after it.
I use following tsql script as part of a transaction:
INSERT INTO [dbo].[tbl_SystemStatistics] ( 40 column)
SELECT 40 column
FROM [dbo].[tbl_SystemStatisticsOverhaul]
Upvotes: 2
Views: 395
Reputation: 121922
Try something like this -
SET NOCOUNT ON;

DECLARE
      @x INT = 1
    , @count INT
    , @batchSize INT = 10000; -- rows per batch; 10 per pass would be far too small for 1M rows

SELECT @count = COUNT(1)
FROM dbo.tbl_SystemStatisticsOverhaul;

-- Drop the temp table if it is left over from a previous run
IF OBJECT_ID('tempdb..#import') IS NOT NULL
    DROP TABLE #import;

-- Number the source rows so they can be copied in fixed-size batches
SELECT [column], RowNumber = ROW_NUMBER() OVER (ORDER BY id)
INTO #import
FROM dbo.tbl_SystemStatisticsOverhaul;

WHILE @x <= @count -- <= rather than <, otherwise the last partial batch is skipped
BEGIN
    INSERT INTO dbo.tbl_SystemStatistics ([column])
    SELECT [column]
    FROM #import
    WHERE RowNumber BETWEEN @x AND @x + @batchSize - 1;

    SET @x = @x + @batchSize;
END
Upvotes: 1
Reputation: 696
Has your max_execution_time in PHP been reached? The default is 30 seconds, I think. http://php.net/manual/en/function.set-time-limit.php
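If the PHP time limit is the culprit, you can lift it just for the migration script before issuing the long-running statement. A minimal sketch, assuming a PDO connection with the sqlsrv driver (the connection details, credentials, and the elided column list are placeholders, not from the question):

```php
<?php
// Lift PHP's execution time limit for this long-running migration script.
// 0 means no limit; alternatively raise max_execution_time in php.ini.
set_time_limit(0);

// Hypothetical connection; substitute your own server, database, and credentials.
$pdo = new PDO('sqlsrv:Server=localhost;Database=Statistics', $user, $pass);

// The bulk copy from the question (column list elided as in the original).
$pdo->exec('INSERT INTO dbo.tbl_SystemStatistics (...)
            SELECT ... FROM dbo.tbl_SystemStatisticsOverhaul');
```

Note that set_time_limit() only governs the PHP side; if the driver or server also enforces a query timeout, that would need to be raised separately.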
Upvotes: 1