Maximus Decimus

Reputation: 5261

Alter column to be identity

I've already read the following answers about the impossibility of altering a column into an identity once it has been created as a regular int:

Adding an identity to an existing column

How to Alter a table for Identity Specification is identity SQL Server

How to alter column to identity(1,1)

But the thing is, I have a table that was migrated to a new one where the ID was not declared as identity from the beginning. The old table, which was created with an identity ID a long time ago, has missing rows due to a purge of historical data. As far as I know, if I add a new identity column to my new table, it will populate that column sequentially, and I need to preserve the IDs from the old table as-is because there is already data linked to those previous IDs.

How can I turn the ID column of the new table into an identity column, filled not sequentially but with the IDs from the old table?

Upvotes: 1

Views: 12911

Answers (3)

tobypls

Reputation: 849

You could try this approach:

  1. Insert the rows with their old IDs with SET IDENTITY_INSERT <new table> ON. This allows you to insert your own ID values.
  2. Reseed the identity, setting it to the highest ID value + 1 with DBCC CHECKIDENT ('<new table>', RESEED, <max ID + 1>). This lets the identity keep increasing from the highest ID onward.

Something like this in code:

-- Allow explicit values to be inserted into the identity column
SET IDENTITY_INSERT <new table> ON
-- <INSERT STUFF HERE>
SET IDENTITY_INSERT <new table> OFF

-- Reseed Identity from max ID
DECLARE @maxval Int
SET @maxval = ISNULL(
(
    SELECT
        MAX(<identity column>) + 1
    FROM <new table>
), 0)
DBCC CHECKIDENT ('<new table>', RESEED, @maxval)

EDIT: This approach requires your ID column to already be defined as an identity, of course.
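
For concreteness, here is a minimal sketch of the whole flow, assuming a hypothetical dbo.NewTable (created with an identity Id column) and a hypothetical dbo.OldTable holding the original, non-sequential IDs; none of these names come from the question:

-- Hypothetical target table; name and columns are assumptions for illustration
CREATE TABLE dbo.NewTable
(
    Id   INT IDENTITY(1, 1) NOT NULL PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL
);

-- Copy the old rows, keeping their original (gapped) IDs
SET IDENTITY_INSERT dbo.NewTable ON;

INSERT INTO dbo.NewTable (Id, Name)
SELECT Id, Name
FROM dbo.OldTable;

SET IDENTITY_INSERT dbo.NewTable OFF;

-- Reseed so future inserts continue after the highest migrated ID;
-- reseeding to the current max is enough here, because the table now
-- contains rows, so the next generated value will be @maxval + 1
DECLARE @maxval INT = ISNULL((SELECT MAX(Id) FROM dbo.NewTable), 0);
DBCC CHECKIDENT ('dbo.NewTable', RESEED, @maxval);

Note that IDENTITY_INSERT can be ON for only one table per session at a time, so switch it OFF as soon as the copy is done.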

Upvotes: 5

user1585204

Reputation: 965

I have a process where a temp table is used between a source CSV file and the production table. The temp table has to match the CSV file's columns, and there is no PK in this data.

To find the set of rows just before and after the point where Azure Data Factory was failing, I imported over 2,000,000 rows into a temp table. The process stopped in Azure at 1,500,000 rows.

The error was that an integer or string would be truncated.

This line of code added an auto-incrementing row number (effectively a PK) to the temp table:

ALTER TABLE ##FLATFILETEMPBDI ADD ROWNUM INT IDENTITY

That would be the simplest solution to get a row number. I was then able to do this query to find the rows just before and after 1,500,000:

SELECT  
  ROWNUM
  , PARTDESCRIPTION
  , LEN(PARTDESCRIPTION) AS LENDESCR
  , QUANTITY
  , ONORDER
  , PRICE
  , MANUFACTURERPARTNUMBER
FROM ##FLATFILETEMPBDI 
WHERE ROWNUM BETWEEN 1499990 AND 1500005

Works perfectly. I was not planning on it being that easy, and was as surprised as anyone to see that ALTER TABLE with IDENTITY did the numbering for me.

Upvotes: 0

ewahner

Reputation: 1189

If you don't have NULLs in the field that you want to copy over from your previous version, you could first figure out what the largest ID is by doing a MAX(Id) select. Then, using SSMS, add your new field, and when you set it as an identity, set the SEED value to something higher than your current max so you don't have collisions on new inserts.
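
A minimal T-SQL sketch of the same idea, assuming a hypothetical dbo.OldTable whose largest ID turns out to be 10000 and a hypothetical dbo.NewTable that doesn't yet have the new field; the names and seed value are illustrative only:

-- Find the largest existing ID (hypothetical table name)
SELECT MAX(Id) AS MaxId
FROM dbo.OldTable;

-- Add the new field as an identity seeded above that max
-- (roughly the T-SQL equivalent of adding the field with a custom seed in the SSMS designer)
ALTER TABLE dbo.NewTable
ADD Id INT IDENTITY(10001, 1) NOT NULL;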

Upvotes: 0
