Reputation: 2753
I have a stored procedure which does a lot of parsing and inserting. It is triggered from the front end for each row in a particular table.
For the sake of modularity and readability I have been tasked with changing the procedure such that the parsing occurs in one stored procedure and inserting in another stored procedure.
Now to my dilemma: I was thinking that the best way to go about this is to split the stored procedure and call the new stored procedures from the original one. Doing it this way, the time needed to change the code is minimal, but my concern is the number of calls made to the database.
If there are N rows, then previously the number of database calls was N. With the new approach, the number of calls will be 2*N.
Another way is to write all the table data into a temp table and then do the processing there. That way the number of database calls is reduced, but it creates another problem with the ids, and changing the code that way would probably take hours.
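For reference, the staging-table approach described above might look roughly like this (all table, column, and procedure names here are hypothetical): the front end loads every row into a staging table once, and then makes a single procedure call to process the whole batch set-based. The id problem mentioned above comes from having to map the staging rows back to the original rows.

```sql
-- Hypothetical sketch of the staging-table approach:
-- the front end bulk-inserts all rows, then makes ONE procedure call
-- instead of one call per row.
CREATE TABLE dbo.Staging
(
    StagingId int IDENTITY(1,1) PRIMARY KEY, -- surrogate key for the batch
    SourceId  int NOT NULL,                  -- original row id; mapping this
                                             -- back correctly is the tricky part
    RawValue  nvarchar(max) NOT NULL         -- data to be parsed
);

-- One call parses and inserts everything that was staged:
EXEC dbo.ProcessStagedRows;
```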
My question is: if I retain the original logic and split it into multiple stored procedures, how badly will it affect performance?
The number of rows from the front end typically won't exceed 100.
Thanks.
Upvotes: 0
Views: 95
Reputation: 10098
As I said in the comment, it will hurt performance; how much exactly is hard to tell without seeing the code.
If you really have to refactor it, try to extract some of the code into inline table-valued functions. Not multi-statement TVFs, not scalar UDFs, not stored procedures, but inline TVFs.
The reason is that the optimizer treats inline TVFs like views: it merges them into the outer query and optimizes the whole thing together, so the refactored code still has a good chance of being optimized properly.
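As a rough illustration of the shape being suggested (names and parsing logic here are made up, since the original code isn't shown): an inline TVF is a single `RETURN (SELECT ...)`, which the optimizer can expand into the calling query just like a view.

```sql
-- Hypothetical example: parsing logic extracted as an INLINE TVF.
-- Note there is no BEGIN/END body and no table variable -- just one SELECT.
CREATE FUNCTION dbo.ParseRow (@RawValue nvarchar(max))
RETURNS TABLE
AS
RETURN
(
    SELECT
        LTRIM(RTRIM(@RawValue))               AS CleanValue,
        TRY_CONVERT(int, LEFT(@RawValue, 10)) AS ParsedId
);
GO

-- The inserting procedure can then consume it set-based, in one statement,
-- and the optimizer merges ParseRow into this query's plan:
INSERT INTO dbo.Target (CleanValue, ParsedId)
SELECT p.CleanValue, p.ParsedId
FROM dbo.Source AS s
CROSS APPLY dbo.ParseRow(s.RawValue) AS p;
```

A multi-statement TVF or scalar UDF with the same logic would instead run as an opaque black box per invocation, which is exactly what this refactoring approach avoids.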
Upvotes: 1