Reputation: 17394
In my current job, I have to run SQL scripts once in a while to update my database, mostly to fix certain records that did not get updated properly. There is page functionality that does the same thing, but that is a longer route, so I run the scripts directly against the database.
I just ran an update query and accidentally updated all of my 30,000+ records. Luckily I was working in my test database.
The particular query I am talking about is this:
update customers set
customer_id = 100 -- the WHERE clause is missing, so it updates every record
I was wondering, is there a way to protect the database against such a bulk update, maybe a trigger that halts the update if it affects more than 500 records? Is there any way to protect the database from accidental, ill-formed queries? I am particularly interested in updates.
Am I doing something really wrong when I run SQL scripts directly against my production server? I am new and I need expert advice.
Upvotes: 1
Views: 742
Reputation: 10115
Your query should look like the one below.
SET NOCOUNT ON
SET XACT_ABORT ON
BEGIN TRY
    BEGIN TRAN
        -- your UPDATE statement goes here
    COMMIT TRAN
END TRY
BEGIN CATCH
    ROLLBACK TRAN
END CATCH
With this exception handling in place, any runtime error will roll the database back to its previous state.
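For example, applied to the update from the question (assuming SQL Server; the WHERE clause and the row-count threshold are just illustrations, not your real logic), the pattern might look like this:
SET NOCOUNT ON
SET XACT_ABORT ON

BEGIN TRY
    BEGIN TRAN

    UPDATE customers
    SET customer_id = 100
    WHERE status = 'FAILED';   -- hypothetical filter, for illustration only

    -- Safety net: abort if the update touched more rows than expected
    IF @@ROWCOUNT > 500
        RAISERROR('Too many rows affected - rolling back.', 16, 1);

    COMMIT TRAN
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRAN;
END CATCH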
Upvotes: 0
Reputation: 1647
Looking at your query, I assume that your Customers table doesn't have a primary key with IDENTITY turned on for Customer_Id; having both set on the table would certainly have stopped that script from completing.
I would suggest that you wrap your query in a transaction. Initially run the script with a ROLLBACK statement and check the output before the rollback happens. If the results are as you expected, replace the ROLLBACK with a COMMIT and re-run the query.
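For illustration (assuming SQL Server; the table definition below is only a guess at your schema), an IDENTITY column simply cannot be updated:
-- Hypothetical table definition, for illustration only
CREATE TABLE Customers (
    Customer_ID INT IDENTITY(1,1) PRIMARY KEY,
    Name        NVARCHAR(100)
);

-- Fails with: "Cannot update identity column 'Customer_ID'."
UPDATE Customers SET Customer_ID = 100;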
EXAMPLE
BEGIN TRAN
UPDATE Customer SET Customer_ID = 100
SELECT * FROM Customer
ROLLBACK
--If results look OK, comment out ROLLBACK and uncomment the line below
--COMMIT
Upvotes: 0
Reputation: 4585
Let me address your second question, whether running scripts directly is "really wrong". Your own suggestion for the first question (a trigger that limits affected rows) might help, but let's talk about planning first.
Whether using scripts directly is "really wrong" or not depends on the risk level. Is this data mission critical? Do you have a good backup, including transaction logs, so you can recover to the minute before your mistake? Can the database be down for the hour it will take you to get it back up?
Assess the risk and take appropriate action: the higher the chance of an error, or the cost of recovering from one, the harder you should work to mitigate it. If either factor is high, force yourself to use the tightly controlled page functionality. At a medium level, write your script with appropriate variables and save it, so that next time you can't accidentally drop the WHERE clause, and take a backup before running it. And if recovering from a mistake is cheap and easy, keep writing your one-off queries and hope for the best.
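A minimal sketch of what such a saved, parameterized script might look like (assuming SQL Server; the column names and values are placeholders, not your real schema). Because the WHERE clause lives in the saved file, it can't be dropped by accident:
-- fix_customer.sql (hypothetical saved script)
DECLARE @TargetCustomerId INT = 100;          -- edit only this value each run
DECLARE @NewStatus NVARCHAR(20) = 'PROCESSED';

UPDATE customers
SET status = @NewStatus                       -- hypothetical column
WHERE customer_id = @TargetCustomerId;        -- WHERE clause is part of the saved script

PRINT CONCAT(@@ROWCOUNT, ' row(s) updated.');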
Upvotes: 2
Reputation: 8729
It's generally not a good idea to have direct update access to a production database. All necessary updates should be managed through stored procedures. That way, you really have to think about what you're doing before doing it.
Also, as @Neville K mentioned, transactions are good. You can set up your stored procedures to refuse to do anything unless a transaction is in progress, and if you never put a COMMIT in your procs, you can always roll back. That said, once you commit, it's game over anyway.
There's no hard and fast way around this; the best you can do is make it difficult for yourself (and others!) to screw up.
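A minimal sketch of that guard (assuming SQL Server; the procedure, table, and column names are just placeholders): the procedure bails out unless the caller has already opened a transaction, so nothing can change outside one.
CREATE PROCEDURE dbo.FixCustomerRecord   -- hypothetical procedure name
    @CustomerId INT
AS
BEGIN
    -- Refuse to run unless the caller opened a transaction first
    IF @@TRANCOUNT = 0
    BEGIN
        RAISERROR('Open a transaction before calling this procedure.', 16, 1);
        RETURN;
    END

    UPDATE customers
    SET status = 'PROCESSED'             -- hypothetical fix-up
    WHERE customer_id = @CustomerId;
END
The caller then runs BEGIN TRAN, executes the procedure, checks the results, and only then decides to COMMIT or ROLLBACK.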
Upvotes: 1
Reputation: 89741
You can certainly put an update trigger on the table and throw an exception if the INSERTED/DELETED pseudo-tables contain more than a certain number of rows - this will cause the attempted transaction to roll back.
Generally, for production, you want to have some controls, such as having scripts like these in tested procs, but obviously every control is eventually going to have a limit - otherwise you'd never be able to insert or change any data.
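A minimal sketch of such a trigger, assuming SQL Server and the 500-row threshold you mentioned (the trigger name is a placeholder; the table name is taken from your question):
CREATE TRIGGER trg_customers_bulk_update_guard   -- hypothetical trigger name
ON customers
AFTER UPDATE
AS
BEGIN
    -- Roll back the whole statement if it touched more rows than allowed
    IF (SELECT COUNT(*) FROM inserted) > 500
    BEGIN
        ROLLBACK TRAN;
        RAISERROR('Update affected more than 500 rows and was rolled back.', 16, 1);
    END
END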
Upvotes: 2