Allan F

Reputation: 2298

Best method to move a large SQL Server table from one database to another?

I have a table with some 5 million rows and over 400 columns in a test-level database.

I want to temporarily hold a copy of this table in the production database.

I know I can use Tasks -> Generate Scripts and choose to script both the schema and the data, but this seems to take a very long time.

I could use the Generate Scripts option to copy just the table definition and handle the data copy via another method.

Can I use an INSERT ... SELECT across servers with SQL Server?

I could add the test DB and server as a linked server.
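
For reference, a rough sketch of what I have in mind with the linked-server route (server, database, and table names below are placeholders):

    -- Run on the production server: register the test server as a linked server.
    EXEC sp_addlinkedserver
         @server = N'TESTSRV',
         @srvproduct = N'',
         @provider = N'SQLNCLI',
         @datasrc = N'test-host\TESTINSTANCE';
    EXEC sp_addlinkedsrvlogin @rmtsrvname = N'TESTSRV', @useself = N'True';

    -- Four-part naming then lets SELECT ... INTO create the copy
    -- and pull the data across in one statement.
    SELECT *
    INTO   dbo.BigTable_Copy
    FROM   TESTSRV.TestDb.dbo.BigTable;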

I could use SSIS.

I'm just asking the forum for a recommendation on the best (or at least a better) approach, i.e. one that is faster than the Generate Scripts option.

I am trying the Generate Scripts option, but it has been running all afternoon.

I am hoping for an approach which will run in under an hour or two.

Upvotes: 5

Views: 8005

Answers (3)

Hadi

Reputation: 37313

In a similar case, I would use SSIS since it gives you more control over the transfer operation. There are many tips and tricks for achieving higher performance, such as importing data in chunks, configuring the buffer size, and using bulk insert.

I have posted many SSIS-related answers that you can refer to for more information.
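
Outside of SSIS itself (where the equivalent knobs are the Data Flow buffer settings and the OLE DB Destination's Rows per batch / Maximum insert commit size), the chunking idea can be sketched in plain T-SQL like this; the linked server, table, and key column names are placeholders and assume an increasing Id key:

    -- Illustration only: copy 100,000 rows at a time so each batch commits separately.
    DECLARE @BatchSize int = 100000,
            @LastId    bigint = 0,
            @Rows      int = 1;

    WHILE @Rows > 0
    BEGIN
        INSERT INTO dbo.BigTable_Copy WITH (TABLOCK)
        SELECT TOP (@BatchSize) *
        FROM   TESTSRV.TestDb.dbo.BigTable
        WHERE  Id > @LastId
        ORDER  BY Id;

        SET @Rows = @@ROWCOUNT;

        SELECT @LastId = MAX(Id) FROM dbo.BigTable_Copy;
    END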

Upvotes: 1

Gerard

Reputation: 301

If you are on SQL Server, I would tend to use the Import/Export Wizard as the quick and easy method. It will fail gracefully if there are issues.

1) Create the table in the destination database (see the sketch after these steps)

2) Right-click the destination database, then Tasks -> Import Data

3) Connect to the source server when prompted, then keep following the prompts
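
For step 1, a trimmed-down sketch of the destination table script (the name and the two columns are placeholders; with 400+ columns you would script the real definition from the source table, e.g. via Script Table as -> CREATE To in SSMS):

    -- Placeholder definition; generate the real one from the source table.
    CREATE TABLE dbo.BigTable_Copy
    (
        Id        bigint       NOT NULL PRIMARY KEY,
        SomeValue nvarchar(50) NULL
        -- ... remaining columns scripted from the source table
    );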

Hope it helps

Upvotes: 4

SchmitzIT

Reputation: 9552

One option is to right-click the database, then select Tasks -> Export Data.

However, if you know SSIS, I'd think it's the easiest way to go, as moving data is exactly what it's intended for. You can increase the batch size there to better suit your data volume.
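
If you instead stage the data through a file (for example exported with bcp in native format), the same batch-size idea can be applied on the load side with BULK INSERT; the file and table names here are placeholders:

    -- Load a previously exported native-format file in 50,000-row batches.
    BULK INSERT dbo.BigTable_Copy
    FROM 'D:\transfer\BigTable.dat'
    WITH (DATAFILETYPE = 'native', BATCHSIZE = 50000, TABLOCK);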

As for how long it takes, that really depends on your hardware. The biggest bottlenecks most likely are your network and disks.

Upvotes: 3
