Reputation: 29
I currently have two tables, Table1 and Table2, structured as below. As you can see, Table1 contains multiple rows per FK value, and the FK column is a foreign key to Table2's ID column. Table2 holds only one row per ID, carrying the most recent value from Table1 (the latest row per FK, ordered by Table1's ID column).
Table 1
ID  FK  END_DTTM
1   1   01/01/2000
2   1   01/01/2005
3   1   01/01/2012
4   1   01/01/2100
5   2   01/01/1999
6   2   01/01/2100
7   3   01/01/2100
Table 2
ID  END_DTTM
1   01/01/2100
2   01/01/2100
3   01/01/2100
The business requirement is to track every update to Table2 so that point-in-time data can be retrieved. To achieve this I am using SQL Server 2016 system-versioned temporal tables, where every update to Table2 automatically creates a version row in the history table.
For the insert/update process I am currently using a cursor, which is terribly slow: it processes around 71,000 rows in 30 minutes, and the table has around 60 million rows! The cursor query is as follows:
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;

        DECLARE @ID AS INT;
        DECLARE @FK AS INT;
        DECLARE @END_DTTM AS DATETIME2;
        DECLARE @SelectCursor AS CURSOR;

        SET @SelectCursor = CURSOR LOCAL STATIC READ_ONLY FORWARD_ONLY FOR
            SELECT [ID], [FK], [END_DTTM] FROM TABLE1 ORDER BY FK, ID;

        OPEN @SelectCursor;
        FETCH NEXT FROM @SelectCursor INTO @ID, @FK, @END_DTTM;

        WHILE @@FETCH_STATUS = 0
        BEGIN
            UPDATE TABLE2
            SET END_DTTM = @END_DTTM
            WHERE ID = @FK;

            IF @@ROWCOUNT = 0
            BEGIN
                INSERT TABLE2 (ID, END_DTTM)
                VALUES (@FK, @END_DTTM);
            END

            FETCH NEXT FROM @SelectCursor INTO @ID, @FK, @END_DTTM;
        END

        CLOSE @SelectCursor;
        DEALLOCATE @SelectCursor;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;

        DECLARE @ErrorNumber INT = ERROR_NUMBER();
        DECLARE @ErrorLine INT = ERROR_LINE();
        DECLARE @ErrorMessage NVARCHAR(4000) = ERROR_MESSAGE();
        DECLARE @ErrorSeverity INT = ERROR_SEVERITY();
        DECLARE @ErrorState INT = ERROR_STATE();

        PRINT 'Actual error number: ' + CAST(@ErrorNumber AS VARCHAR(10));
        PRINT 'Actual line number: ' + CAST(@ErrorLine AS VARCHAR(10));
        PRINT 'Actual message: ' + CAST(@ErrorMessage AS VARCHAR(4000));
        PRINT 'Actual severity: ' + CAST(@ErrorSeverity AS VARCHAR(10));
        PRINT 'Actual state: ' + CAST(@ErrorState AS VARCHAR(10));

        -- Column list and value list must match (four columns, four values).
        INSERT INTO ERROR_LOG
        (
            SOURCE_PRIMARY_KEY
            ,ERROR_CODE
            ,ERROR_COLUMN
            ,ERROR_DESCRIPTION
        )
        VALUES
        (
            NULL,
            @ErrorNumber,
            @ErrorState,
            @ErrorMessage
        );

        THROW;
        -- RAISERROR(@ErrorMessage, @ErrorSeverity, @ErrorState);
    END CATCH
END;
I tried using a CTE but didn't see any performance gain with it; in fact it was a tad slower than the cursor itself.
Is there a better way to achieve this with a set-based operation that still applies every row from Table1 to Table2, so that the temporal table picks up the updates and tracks the changes?
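For illustration, the kind of set-based upsert I have in mind would look something like this (a sketch only, assuming ID is the primary key of TABLE2; note that collapsing Table1 to the latest row per FK means intermediate END_DTTM values would not each get their own history version, unlike the row-by-row cursor):

```sql
-- Sketch: collapse TABLE1 to one row per FK (latest by ID), then upsert into TABLE2.
;WITH Latest AS
(
    SELECT FK, END_DTTM,
           ROW_NUMBER() OVER (PARTITION BY FK ORDER BY ID DESC) AS rn
    FROM TABLE1
)
MERGE TABLE2 AS t
USING (SELECT FK, END_DTTM FROM Latest WHERE rn = 1) AS s
    ON t.ID = s.FK
WHEN MATCHED AND t.END_DTTM <> s.END_DTTM THEN
    UPDATE SET END_DTTM = s.END_DTTM
WHEN NOT MATCHED THEN
    INSERT (ID, END_DTTM) VALUES (s.FK, s.END_DTTM);
```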
Upvotes: 0
Views: 293
Reputation: 3863
I am not sure how you are running the update SQL, but I'll describe the process that I use to track changes in Oracle.
I set up a trigger on the table that I want to audit. I create another table with the same columns, one set prefixed with OLD_ and the other prefixed with NEW_. In an Oracle trigger, you can reference the new row and the old row. I run an insert into the audit table with the old and new values, the DML action type, and the TIMESTAMP. Additionally, I add the database user and, if possible, the application user that requested the change.
In the current RAC cluster and on our ancient 9i AIX server, I have never noticed any performance degradation.
Additionally, if the transaction is rolled back, the audit record is not inserted, since the trigger runs inside the transaction.
Don't let people tell you NOT to use SQL triggers. While you don't want to do "crazy" things in triggers (like running queries or calling web services), this is a perfect application for one. (I normally use them to add a last-updated date to a row; I don't trust the application layer for accurate information.)
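A minimal sketch of the kind of trigger described above (Oracle syntax; the audit table TABLE2_AUDIT and its columns are hypothetical names):

```sql
-- Row-level audit trigger: writes old/new values, the DML action,
-- the timestamp, and the database user into a separate audit table.
CREATE OR REPLACE TRIGGER trg_table2_audit
AFTER INSERT OR UPDATE OR DELETE ON table2
FOR EACH ROW
BEGIN
    INSERT INTO table2_audit
        (old_id, old_end_dttm, new_id, new_end_dttm,
         dml_action, changed_at, changed_by)
    VALUES
        (:OLD.id, :OLD.end_dttm, :NEW.id, :NEW.end_dttm,
         CASE WHEN INSERTING THEN 'I'
              WHEN UPDATING  THEN 'U'
              ELSE 'D' END,
         SYSTIMESTAMP, USER);
END;
/
```

Because the trigger fires inside the caller's transaction, a rollback discards the audit row along with the change.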
Upvotes: 1
Reputation: 1269623
Oh, is that all you want?
insert into table2 (ID, END_DTTM)
select FK, max(END_DTTM)
from table1
group by FK;
Upvotes: 0