BillRuhl

Reputation: 395

How to drill down to a bad record in a large data set?

I’m looking for a more efficient way to drill down to one record in a large data set. Every once in a while I have to find a record that made it past the cleaning process and has some junk data in one of the fields. Currently I wind up dropping back to a cursor, which is wildly inefficient, to loop over the data set. Here is an example of what I'm talking about:

CREATE TABLE #t(
PK INT PRIMARY KEY IDENTITY(1,1),
SomeVal VARCHAR(50) NULL
)

INSERT INTO #t(SomeVal)
VALUES('1.2'),('3.4'),('5.6'),('7.8 Junk.....')


DECLARE x CURSOR FOR SELECT PK, SomeVal FROM #t
DECLARE @y INT
DECLARE @z VARCHAR(50)

OPEN x
FETCH NEXT FROM x INTO @y, @z

WHILE @@FETCH_STATUS = 0
    BEGIN
        BEGIN TRY
            SELECT CAST(@z AS MONEY) --Create the error
        END TRY
        BEGIN CATCH
            PRINT('The primary key is: ' + CAST(@y AS VARCHAR(50)) + ' and the offending value is: ' + @z) --Capture the record's primary key and the value that causes the error
        END CATCH

        FETCH NEXT FROM x INTO @y, @z
    END

CLOSE x
DEALLOCATE x
DROP TABLE #t

Does anyone have a pro tip that makes this faster? I know that cursors are bad news most of the time, but I had to come up with something in a pinch...I want to be better prepared the next time it comes up.

Upvotes: 0

Views: 328

Answers (1)

TRY_CAST/TRY_CONVERT (available since SQL Server 2012) return NULL on a failed conversion and are designed specifically for this scenario.

SELECT TRY_CAST(@z AS MONEY)
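Applied to the question's `#t` table, this lets the whole cursor collapse into one set-based query that filters on the failed conversions directly (a sketch; the `SomeVal IS NOT NULL` check keeps genuinely-NULL rows out of the result, since TRY_CAST also returns NULL for NULL input):

```sql
-- Find every row whose value does not convert to MONEY.
-- TRY_CAST returns NULL on failure, so the bad rows are the ones
-- where the value itself is not NULL but the conversion result is.
SELECT PK, SomeVal
FROM #t
WHERE SomeVal IS NOT NULL
  AND TRY_CAST(SomeVal AS MONEY) IS NULL;
```

Against the sample data this returns PK = 4 with the value '7.8 Junk.....' in a single scan of the table, with no loop at all.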

Upvotes: 2
