Jona

Reputation: 562

Trigger to delete rows from table

I'm trying to create a trigger so that when the records in a log table exceed a certain number of rows, it starts purging x of the oldest records, keeping the newer records based on a certain date.

Current Query (Pseudocode):

    -- Pseudocode: after inserts, purge old rows once the table exceeds x rows
    -- ('certaindate' and x are placeholders)
    CREATE TRIGGER PURGE_USERS ON UsersAD FOR INSERT
    AS
    BEGIN
        DECLARE @date datetime
        SET @date = 'certaindate'

        IF (SELECT COUNT(*) FROM UsersAD) > x
            DELETE FROM UsersAD
            WHERE updateDate < @date
    END
    GO
    -- UPDATE: Might have to partition instead of delete.

I'm assuming you get my point, but I don't know how to create the trigger to start deleting the records, nor do I know a good way to figure out when to delete them.

Appreciate the help,

Thanks.

Upvotes: 0

Views: 689

Answers (3)

Jona

Reputation: 562

For those that are wondering: instead of creating a trigger, I decided to make a function, but in SQL Server a function must have some sort of return value, which is not what I wanted in my case. After doing some more research, it turned out I had to make a stored procedure instead, which is then executed on a daily basis. Within my query I implemented it so it cleans up the logs each week, since that gives me enough time to look at what happened during the week.
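
For reference, a minimal sketch of what that stored procedure might look like (the procedure name is my own; UsersAD and updateDate are the table and column from the question, and the one-week retention window is just an example):

    -- Rough sketch: keep only the last week of log entries
    CREATE PROCEDURE dbo.PurgeUsersADLog
    AS
    BEGIN
        SET NOCOUNT ON;

        DELETE FROM UsersAD
        WHERE updateDate < DATEADD(DAY, -7, GETDATE());
    END
    GO

The procedure can then be scheduled, for example with a SQL Server Agent job, to run once a day.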

Thanks again.

Upvotes: 0

Eric Brandt

Reputation: 8111

Too long for a comment.

Typically you wouldn't want to tie your data retention policy to the number of rows in the table, but rather to the time frame that the logged data remains relevant.

Based on your use case and your business rules and, possibly, applicable laws, you'll want to determine if you need the logs for x number of hours, days, weeks, months, or years. Once you've made that determination, it's just a matter of setting up a job on an appropriate schedule to remove the records that are no longer needed.
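
As a rough sketch, assuming SQL Server Agent is available, a nightly cleanup job could be set up along these lines (the job name, database name, cleanup statement, and 30-day retention window are all illustrative):

    USE msdb;
    GO

    EXEC dbo.sp_add_job
        @job_name = N'Purge old UsersAD log rows';

    EXEC dbo.sp_add_jobstep
        @job_name = N'Purge old UsersAD log rows',
        @step_name = N'Delete expired rows',
        @subsystem = N'TSQL',
        @database_name = N'YourDatabase',
        @command = N'DELETE FROM UsersAD WHERE updateDate < DATEADD(DAY, -30, GETDATE());';

    EXEC dbo.sp_add_jobschedule
        @job_name = N'Purge old UsersAD log rows',
        @name = N'Nightly at 2 AM',
        @freq_type = 4,            -- daily
        @freq_interval = 1,
        @active_start_time = 020000;

    EXEC dbo.sp_add_jobserver
        @job_name = N'Purge old UsersAD log rows',
        @server_name = N'(local)';
    GO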

Knowing your retention period will also help in deciding on the granularity for defining table partitions, which, as Gordon Linoff notes, is likely the most efficient way to handle the data removal long-term. It's more effort up front, but your future self will thank you for taking the time to do it when you aren't troubleshooting unexpected slowdowns later on.

Upvotes: 3

Gordon Linoff

Reputation: 1271211

A better strategy is probably to partition the table. Partitioning is better than deleting, because you can drop a partition much more efficiently than you can delete a bunch of records.

You would especially not want to do a large number of deletes in a trigger, because the table may remain locked for the duration of the insert. You expect an insert to be fast . . . and suddenly it is really slow.

Normally, partitions would be used to keep a fixed time period of data. Say, you could have a separate partition for each week and keep the last 100 weeks of data.
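
A simplified sketch of that setup, with weekly partitions (the function and scheme names, boundary dates, and filegroup mapping are illustrative; TRUNCATE ... WITH (PARTITIONS) needs SQL Server 2016 or later):

    -- Weekly boundaries; in practice new boundaries are split in ahead of time
    CREATE PARTITION FUNCTION pf_UsersAD_Weekly (datetime)
    AS RANGE RIGHT FOR VALUES ('2019-01-07', '2019-01-14', '2019-01-21');

    CREATE PARTITION SCHEME ps_UsersAD_Weekly
    AS PARTITION pf_UsersAD_Weekly ALL TO ([PRIMARY]);

    -- The log table is created on the scheme, partitioned by updateDate:
    -- CREATE TABLE UsersAD (..., updateDate datetime) ON ps_UsersAD_Weekly (updateDate);

    -- Removing an old week is a metadata operation, not a row-by-row delete
    TRUNCATE TABLE UsersAD WITH (PARTITIONS (1));

    -- Once empty, the oldest boundary can be merged away
    ALTER PARTITION FUNCTION pf_UsersAD_Weekly() MERGE RANGE ('2019-01-07');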

You can learn more about partitioning in the documentation.

Upvotes: 4
