Lazytitan

Reputation: 77

Backing up a table before deleting all the records and reloading it in SSIS

What is the best approach here? What are the best practices for this issue?

Upvotes: 0

Views: 2582

Answers (3)

KRM

Reputation: 1405

For backing up your table, instead of loading data from the original table into a backup table, you can simply rename the original table to a backup name, recreate the original table with the same structure as the backup, and then drop the renamed table only once your data load is successful. This can save the time it takes to transfer data from one table to another. You may want to test which approach is faster for your data and table structure, but if the table holds a lot of data, the approach below may be faster.

sp_rename 'abcTbl', 'abcTbl_bkp';

CREATE TABLE abcTbl (...);

While creating this table, keep the same column definitions as abcTbl_bkp.

Load your new data into the abcTbl table, then:

DROP TABLE abcTbl_bkp;
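
If you don't want to hand-write the column list, here is one minimal sketch of the whole swap. Note that the SELECT ... INTO shortcut is my addition, not part of the answer above; it copies the column definitions only, not indexes, constraints, or defaults, so verify it fits your table.

-- Rename the current table so it becomes the backup
EXEC sp_rename 'abcTbl', 'abcTbl_bkp';

-- Recreate an empty abcTbl with the same column definitions
-- (columns only; indexes, constraints and defaults are not copied)
SELECT TOP (0) * INTO abcTbl FROM abcTbl_bkp;

-- ... run the SSIS load into abcTbl here ...

-- Drop the backup only after the load has succeeded
DROP TABLE abcTbl_bkp;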

Upvotes: 1

userfl89

Reputation: 4790

Instead of creating additional tables, you can set the package to execute as a single transaction. That way, if any component fails, all the tasks that have already executed are rolled back and subsequent ones do not run.

To do this, set the TransactionOption property to Required on the package, which causes the package to begin a transaction. Then set the same property to Supported on all components that you want to succeed or fail together; the Supported level makes those tasks join the transaction already in progress in their parent container, which in this case is the package. If there are other components in the package that you want to commit or roll back independently of these tasks, place the related objects in a Sequence Container and apply the Required level to that container instead.

One important thing to note: if anything performs a TRUNCATE, every other component that accesses the truncated object will need its ValidateExternalMetadata property set to False to avoid the known blocking issue this causes.

Upvotes: 0

Shooter McGavin

Reputation: 177

If I understand correctly, you are asking for a method to capture the older data before loading the new data. I would agree with your DBAs that a separate table for every reload would be extremely messy and not very usable if you ever need it.
Instead, create a table that copies your load table but adds a single DateTime field (say, history_date). On each load, flow all the data from your primary table into this backup table, and use a Derived Column transformation in the Data Flow to populate the history_date value.
Once the backup is complete, either truncate or delete the contents of the current table, then load the new data. A T-SQL equivalent of this flow is sketched below.
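
If you'd rather do the backup step in an Execute SQL Task instead of a Data Flow, a minimal sketch could look like this. The table and column names are placeholders, since the real structure isn't shown in the question; the answer itself describes doing the same thing with a Derived Column in the Data Flow.

-- Copy the current contents into the history table, stamping each row
-- with the load date (history_date); column list is hypothetical
INSERT INTO dbo.LoadTable_History (Id, SomeValue, history_date)
SELECT Id, SomeValue, GETDATE()
FROM dbo.LoadTable;

-- Clear the primary table before the new load
TRUNCATE TABLE dbo.LoadTable;

-- ... the SSIS package then loads the new data into dbo.LoadTable ...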

Upvotes: 0
