Reputation: 365
We have a system that creates a table in a database on our production server for each day/shift. I would like to somehow grab the data from that server, move it to our archive server, and remove it from the production server once it is more than x days old.
On the production server, the database is called "transformations" and the tables are named "yyyy-mm-dd_shift_table". I would like to move these into a database called "Archive", with the same table names, on another server running SQL Server 2012. Each table contains about 30k records for the day.
The way I see it, this would ideally be a stored procedure in SQL that can be scheduled to run daily, hourly, etc.
Suggestions on how to attack this would be great.
EDIT: I'm happy to do a SELECT on all matching tables in the database and write them into a single table in my database.
Upvotes: 1
Views: 2065
Reputation: 365
After a lot of digging today, I have come up with the following. It loads all the data from the remote server and inserts it into the table on the local server. This requires a linked server on your archive server, which you use to query the remote server. I'm sure you could reverse this and push the data instead, but I didn't want to chew up cycles on the production server.
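For reference, the linked server itself can be registered with sp_addlinkedserver on the archive server. A minimal sketch, assuming the SQL Server Native Client provider and placeholder server/login names:

-- Run once on the archive server: register the production server as a linked server.
-- 'Remote_ServerName' and 'ProductionHost' are placeholders for your environment.
EXEC sp_addlinkedserver
    @server = N'Remote_ServerName',    -- name used in the four-part queries below
    @srvproduct = N'',
    @provider = N'SQLNCLI',            -- SQL Server Native Client
    @datasrc = N'ProductionHost'       -- network name/instance of the production server
-- Map a login if the two servers do not share credentials (adjust to your security setup).
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'Remote_ServerName',
    @useself = N'FALSE',
    @locallogin = NULL,
    @rmtuser = N'archive_reader',      -- placeholder remote login
    @rmtpassword = N'********'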
-- Set up the variables
--Counter for the loop
DECLARE @i int
--Variable to hold the SQL queries; nvarchar(max) so longer generated statements are not silently truncated
DECLARE @SQLCode nvarchar(max)
--Variable to hold the number of rows to process
DECLARE @numrows int
--Table to hold the SQL queries, with an index for looping
DECLARE @SQLQueries TABLE (
idx smallint Primary Key IDENTITY(1,1)
, SQLCode nvarchar(max)
)
--Set up a table with the SQL queries that will need to be run on the remote server. This section creates an INSERT statement
--which is returning all the records in the remote table that do not exist in the local table.
INSERT INTO @SQLQueries
select 'INSERT INTO Local_Table_Name
SELECT S.* FROM [Remote_ServerName].[Transformations].[dbo].[' + name + '] AS S
LEFT JOIN Local_Table_Name AS T ON (T.Link_Field = S.Link_Field)
WHERE T.Link_Field IS NULL'+
CHAR(13) + CHAR(10) + CHAR(13) + CHAR(10)
from [Remote_ServerName].[Transformations].sys.tables
where name Like '%_Table_Suffix'
--Set up the loop to process all the tables
SET @i = 1
--Set up the number of rows in the resultant table
SET @numrows = (SELECT COUNT(*) FROM @SQLQueries)
--Only process if there are queries to run
IF @numrows > 0
--Loop while there are still records to go through
WHILE (@i <= (SELECT MAX(idx) FROM @SQLQueries))
BEGIN
--Load the Code to run into a variable
SET @SQLCode = (SELECT SQLCode FROM @SQLQueries WHERE idx = @i);
--Execute the code
EXEC (@SQLCode)
--Increase the counter
SET @i = @i + 1
END
The initial run over 45 tables inserted about 1.2 million records and took 2.5 minutes. After that, each run took about 1.5 minutes and inserted only about 50-100 records.
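This covers the archive copy but not the second half of the question, purging tables more than x days old from the production server. A minimal sketch of that cleanup, assuming the yyyy-mm-dd prefix on each table name and the same placeholder suffix as above (the retention value is an assumption; run it on the production server only after verifying the rows landed in Archive, and note that TRY_CONVERT requires SQL Server 2012+):

-- Drop production tables whose yyyy-mm-dd name prefix is older than the retention window.
DECLARE @RetentionDays int = 30   -- placeholder: days to keep on production
DECLARE @Cutoff date = DATEADD(DAY, -@RetentionDays, GETDATE())
DECLARE @TableName sysname
DECLARE @DropCode nvarchar(max)
DECLARE OldTables CURSOR FOR
    SELECT name
    FROM Transformations.sys.tables
    WHERE name LIKE '%_Table_Suffix'                      -- same placeholder suffix as above
      AND TRY_CONVERT(date, LEFT(name, 10)) < @Cutoff     -- parse the yyyy-mm-dd prefix
OPEN OldTables
FETCH NEXT FROM OldTables INTO @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @DropCode = N'DROP TABLE Transformations.dbo.' + QUOTENAME(@TableName)
    EXEC (@DropCode)
    FETCH NEXT FROM OldTables INTO @TableName
END
CLOSE OldTables
DEALLOCATE OldTables

Both steps can then be wrapped in a stored procedure and scheduled with a SQL Server Agent job to get the daily/hourly cadence asked for.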
Upvotes: 2
Reputation: 3929
I actually created a solution for this and have it posted on GitHub. It uses a library called EzAPI and will sync all the tables and columns from one server to another.
You're welcome to use it. The basic process works by first checking the metadata between the databases and generating any changed objects. After making the necessary modifications to the destination server, it generates one SSIS package per object and then executes the package. You can choose to remove or keep the packages after they are generated.
https://github.com/thevinnie/SyncDatabases
Upvotes: 0