DigDoug

Reputation: 306

Making structural changes to very large tables in an online environment

So here's what I'm facing.

The Problem

Requirements

Ideas

  1. Put a trigger on the existing table to update the corresponding row in the new table if it already exists there.
  2. Iterate through the original table, copying data into the new table ~10,000 rows at a time, maybe 2,000 values of the first part of the old key per batch.
  3. When the copy is complete, rename the old table to "ExistingTableOld" and the new one from "NewTable" to "ExistingTable". This should allow stored procs to continue to run without intervention (a rough T-SQL sketch of all three steps follows this list).
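
A minimal T-SQL sketch of steps 1-3, assuming hypothetical names: the existing table dbo.ExistingTable with a two-part key (OldKeyPart1, OldKeyPart2) plus one data column SomeColumn, and dbo.NewTable already created with the new structure. The column names and batch size are placeholders, not part of the question.

    -- 1) Trigger keeping dbo.NewTable in sync while the copy runs;
    --    it only updates rows that already exist in the new table.
    CREATE TRIGGER trg_ExistingTable_Sync
    ON dbo.ExistingTable
    AFTER INSERT, UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        UPDATE n
        SET    n.SomeColumn = i.SomeColumn      -- repeat for each copied column
        FROM   dbo.NewTable AS n
        JOIN   inserted     AS i
          ON   i.OldKeyPart1 = n.OldKeyPart1
         AND   i.OldKeyPart2 = n.OldKeyPart2;
    END;
    GO

    -- 2) Copy in batches of ~10,000 rows, skipping rows already present.
    DECLARE @batch int = 10000;
    WHILE 1 = 1
    BEGIN
        INSERT INTO dbo.NewTable (OldKeyPart1, OldKeyPart2, SomeColumn)
        SELECT TOP (@batch) e.OldKeyPart1, e.OldKeyPart2, e.SomeColumn
        FROM   dbo.ExistingTable AS e
        WHERE  NOT EXISTS (SELECT 1
                           FROM   dbo.NewTable AS n
                           WHERE  n.OldKeyPart1 = e.OldKeyPart1
                             AND  n.OldKeyPart2 = e.OldKeyPart2);
        IF @@ROWCOUNT = 0 BREAK;                -- nothing left to copy
    END;
    GO

    -- 3) Swap the names once the copy has caught up.
    EXEC sp_rename 'dbo.ExistingTable', 'ExistingTableOld';
    EXEC sp_rename 'dbo.NewTable', 'ExistingTable';
    GO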

Are there any glaring omissions in the plan, or best practices I'm ignoring?

Upvotes: 5

Views: 141

Answers (2)

Nicholas Carey

Reputation: 74315

My experience with big schema changes is that they are best done during a maintenance window (at night or over a weekend), when users are booted off the system, just like running dbcc checkdb with the repair option. Then, when things go south, you have the option to roll back to the full backup that you providentially made right before starting the upgrade.
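
A minimal sketch of that pre-upgrade full backup; the database name and file path are placeholders:

    -- Full backup taken immediately before starting the upgrade
    -- (database name and backup path are placeholders).
    BACKUP DATABASE MyDatabase
        TO DISK = N'D:\Backups\MyDatabase_PreUpgrade.bak'
        WITH INIT, CHECKSUM;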

Item #3 on your list, renaming the old/new tables: you'll probably want to recompile the dependent stored procedures and views. In my experience, execution plans are bound against object IDs rather than object names.

Consider table dbo.foo: if it is renamed to dbo.foo_old, stored procedures or user-defined functions that reference it won't necessarily error out, because their cached execution plans are still bound to the same object ID and continue to work perfectly fine against the renamed table, until the dependent object is recompiled and its execution plan is rebound.

sp_recompile is your friend.
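
For example, after the rename you could mark the swapped-in table so everything referencing it gets recompiled on next use; the table name is from step 3 of the question, and the procedure name is hypothetical:

    -- Recompile every proc/trigger/UDF that references the table, the next time each runs.
    EXEC sp_recompile N'dbo.ExistingTable';

    -- Or target a single dependent stored procedure (hypothetical name).
    EXEC sp_recompile N'dbo.usp_GetExistingRows';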

Upvotes: 1

Tim Rogers

Reputation: 21713

Difficult problem. Your plan sounds good, but I'm not totally sure you really need to batch the query, as long as you run it at the READ UNCOMMITTED transaction isolation level to stop shared locks from being generated.
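
A sketch of that approach, reusing the table names from the question's step 3 and the same hypothetical column list as above; note that the isolation level only affects the reads from the source table, so it's worth testing whether one big INSERT is acceptable on your system:

    -- Run the copy as a single statement with dirty reads on the source;
    -- the INSERT into dbo.NewTable still takes its own locks.
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

    INSERT INTO dbo.NewTable (OldKeyPart1, OldKeyPart2, SomeColumn)
    SELECT e.OldKeyPart1, e.OldKeyPart2, e.SomeColumn
    FROM   dbo.ExistingTable AS e;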

Upvotes: 1
