nightcoder

Reputation: 13519

Fastest way to convert a very large SQL Server table

We are redesigning a very large (~100 GB, partitioned) table in SQL Server 2012.

For that, we need to convert the data from the old (existing) table into the newly designed table on the production server. The new table is also partitioned. Rows are only appended to the table.

The problem is that a lot of users work on this server, so we can run the conversion only in chunks, and only when the server is not under heavy load (a couple of hours a day).

Is there a better and faster way?

This time we will finish the conversion in a few days (and then switch our application to use the new table), but what would we do if the table were 1 TB? Or 10 TB?

PS. More details on the current process:
The tables are partitioned on the CloseOfBusinessDate column (DATE). Currently we run this query when the server is under low load:

INSERT INTO
  NewTable
...
SELECT ... FROM
  OldTable -- this SELECT involves xml parsing and CROSS APPLY
WHERE
  CloseOfBusinessDate = @currentlyMigratingDate

Every day, about 1M rows from the old table get converted into 200M rows in the new table.
When we finish the conversion process, we will simply update our application to use NewTable.
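
If it helps to see it, the whole process can be driven by a trivial per-day loop. A minimal JDBC sketch (the connection string, date range, and class name are placeholders, and the statement body stands for the elided query above):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.LocalDate;

public class ChunkedMigration {
    // The statement body is the (elided) conversion query shown above,
    // parameterized by partition date.
    private static final String CONVERT_ONE_DAY =
        "INSERT INTO NewTable ... SELECT ... FROM OldTable " +
        "WHERE CloseOfBusinessDate = ?";

    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://prod;databaseName=AppDb;integratedSecurity=true;";
        LocalDate day  = LocalDate.of(2012, 1, 2);  // placeholder range
        LocalDate stop = LocalDate.of(2012, 3, 1);
        try (Connection con = DriverManager.getConnection(url)) {
            for (; day.isBefore(stop); day = day.plusDays(1)) {
                try (PreparedStatement ps = con.prepareStatement(CONVERT_ONE_DAY)) {
                    ps.setObject(1, day);   // one partition's worth of rows per pass
                    ps.executeUpdate();     // run only during the low-load window
                }
            }
        }
    }
}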

Upvotes: 0

Views: 250

Answers (1)

nightcoder

Reputation: 13519

To everybody who took the time to read the question and tried to help me: I'm sorry, I didn't have enough details myself. It turns out the query that selects data from the old table and converts it is VERY slow (thanks to @Martin Smith, I decided to check the SELECT query on its own). The query parses XML and uses CROSS APPLY. I think the better way in our case is to write a small application that simply loads the data from the old table for each day, converts it in memory, and then uses Bulk Copy to insert it into the new table.
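
A rough sketch of what I have in mind, using the Microsoft JDBC driver's SQLServerBulkCopy (the table/column names, connection handling, and convertRows logic are illustrative placeholders, not our real schema):

import com.microsoft.sqlserver.jdbc.ISQLServerBulkData;
import com.microsoft.sqlserver.jdbc.SQLServerBulkCopy;
import com.microsoft.sqlserver.jdbc.SQLServerBulkCopyOptions;
import java.sql.*;
import java.time.LocalDate;
import java.util.*;

public class BulkMigration {

    // Loads one day's raw rows, converts them in memory, bulk-copies the result.
    public static void migrateDay(Connection con, LocalDate day) throws SQLException {
        List<Object[]> converted;
        // Plain single-table read; the expensive XML parsing / CROSS APPLY
        // work moves out of T-SQL and into convertRows below.
        String select = "SELECT Id, Payload FROM OldTable WHERE CloseOfBusinessDate = ?";
        try (PreparedStatement ps = con.prepareStatement(select)) {
            ps.setObject(1, day);
            try (ResultSet rs = ps.executeQuery()) {
                converted = convertRows(rs);
            }
        }
        try (SQLServerBulkCopy bulk = new SQLServerBulkCopy(con)) {
            SQLServerBulkCopyOptions opts = new SQLServerBulkCopyOptions();
            opts.setBatchSize(50000);    // commit in batches, not one huge transaction
            opts.setBulkCopyTimeout(0);  // no timeout; a day's load is long-running
            bulk.setBulkCopyOptions(opts);
            bulk.setDestinationTableName("dbo.NewTable");
            bulk.writeToServer(new InMemoryRows(converted));
        }
    }

    // Hypothetical conversion: parse each XML payload and fan one source row
    // out into many target rows (the work CROSS APPLY used to do).
    private static List<Object[]> convertRows(ResultSet rs) throws SQLException {
        List<Object[]> out = new ArrayList<>();
        while (rs.next()) {
            long id = rs.getLong("Id");
            String xml = rs.getString("Payload");
            // ... real XML parsing here; each source row yields ~200 rows ...
            out.add(new Object[] { id, xml });
        }
        return out;
    }

    // Minimal in-memory source for bulk copy; the two-column schema is illustrative.
    private static final class InMemoryRows implements ISQLServerBulkData {
        private final List<Object[]> rows;
        private int pos = -1;
        InMemoryRows(List<Object[]> rows) { this.rows = rows; }
        public Set<Integer> getColumnOrdinals() { return new TreeSet<>(Arrays.asList(1, 2)); }
        public String getColumnName(int col) { return col == 1 ? "Id" : "Value"; }
        public int getColumnType(int col) { return col == 1 ? Types.BIGINT : Types.NVARCHAR; }
        public int getPrecision(int col) { return col == 1 ? 19 : 4000; }
        public int getScale(int col) { return 0; }
        public Object[] getRowData() { return rows.get(pos); }
        public boolean next() { return ++pos < rows.size(); }
    }

    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://prod;databaseName=AppDb;integratedSecurity=true;";
        try (Connection con = DriverManager.getConnection(url)) {
            migrateDay(con, LocalDate.of(2012, 1, 2)); // placeholder date
        }
    }
}

The idea is that the expensive XML shredding moves out of the server and into the application (which could even run per-day conversions in parallel), so the server only has to serve a cheap single-table read and absorb a bulk load per partition.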

Upvotes: 1
