Here's an example of the scenario:
CREATE TABLE dbo.TestConversionTable
(
    Value VARCHAR(MAX)
)
GO
INSERT INTO dbo.TestConversionTable VALUES ('1')
INSERT INTO dbo.TestConversionTable VALUES ('foo')
GO
CREATE VIEW dbo.TestConversion
AS
SELECT CONVERT(BIGINT, Value) AS Value
FROM dbo.TestConversionTable
WHERE Value <> 'foo'
GO
SELECT * FROM dbo.TestConversion --this works
SELECT * FROM dbo.TestConversion WHERE Value = 1 --Error converting data type varchar to bigint.
SELECT * FROM dbo.TestConversion WHERE Value = '1' --Same thing, this doesn't work either.
I would expect both scenarios to work, since the view already filters out the bad data. What's even stranger is that the second query doesn't work either. I can only grab an estimated execution plan for both of them (since I can't actually run the queries without hitting the error), and the plans are identical. Based on the estimated plan, the bad data should be filtered out before the WHERE Value = 1 predicate is applied.
Edit: To test the query plans, I changed the CONVERT to TRY_CONVERT instead. The result of the plans are still identical, and it looks like the filter occurs before the conversion:
I would expect both the scenarios to work since the view already filters out the bad data
You are using a view, and a view only stores its definition. At run time, both of your queries are expanded inline, and this conversion can fail:
SELECT CONVERT(BIGINT,Value) Value
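For illustration, after the view is expanded inline, your first query is logically equivalent to something like the following sketch. The optimizer is free to evaluate the two predicates in either order, so CONVERT may run against 'foo' before the filter removes that row:

```sql
-- The view reference is replaced by its definition; both predicates end up
-- against the same table scan, with no guaranteed evaluation order.
SELECT CONVERT(BIGINT, Value) AS Value
FROM dbo.TestConversionTable
WHERE Value <> 'foo'
  AND CONVERT(BIGINT, Value) = 1
```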
You might also think that the WHERE clause should filter out the bad data first, but this may or may not happen. This has been explained by Paul White here: TSQL divide by zero encountered despite no columns containing 0
SQL is a declarative language; you write a query that logically describes the result you want, but it is up to the optimizer to produce a physical plan. This physical plan may not bear much relation to the written form of the query, because the optimizer does not simply reorder 'steps' derived from the textual form of the query, it can apply over 300 different transformations to find an efficient execution strategy.
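If you need the conversion to be safe regardless of plan shape, one option (a sketch, using a hypothetical view name) is to convert with TRY_CONVERT, which returns NULL instead of raising an error when the value can't be converted (available on SQL Server 2012 and later):

```sql
CREATE VIEW dbo.TestConversionSafe
AS
SELECT TRY_CONVERT(BIGINT, Value) AS Value   -- NULL for rows that don't convert
FROM dbo.TestConversionTable
WHERE TRY_CONVERT(BIGINT, Value) IS NOT NULL
GO
```

With this definition, a query like SELECT * FROM dbo.TestConversionSafe WHERE Value = 1 cannot raise a conversion error, because no evaluation order can make TRY_CONVERT throw.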
There is also a Connect item raised about this, but as you can see from the comments on it, opinions differ.
Also see the example below from the Connect item; it is similar to yours, and it also relies on filtering out bad data:
create table albert(a int NOT NULL,
b varchar(23) NOT NULL)
create table stina (a int NOT NULL)
go
insert albert (a, b)
values (1, '99'),
(2, 'Gurka'),
(3, '89')
insert stina (a) values(1), (3), (9), (12)
go
SELECT a.a, a.b + 100
FROM albert a
JOIN stina s ON s.a = a.a
The above query fails with "Conversion failed when converting the varchar value 'Gurka' to data type int.", even though the 'Gurka' row would never appear in the output.
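The same TRY_CONVERT approach (again a sketch; the column alias is made up for illustration) makes this query safe, because the expression yields NULL rather than an error for the 'Gurka' row even if it is evaluated before the join eliminates that row:

```sql
SELECT a.a, TRY_CONVERT(INT, a.b) + 100 AS b_plus_100
FROM albert a
JOIN stina s ON s.a = a.a
```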
Coming to your edit and the point below:
Edit: To test the query plans, I changed the CONVERT to TRY_CONVERT instead. The result of the plans are still identical, and it looks like the filter occurs before the conversion:
The plans you provided are estimated ones, not actual ones. See this answer on how the two can differ: Differences between estimated and actual execution plans
Also check out this answer by Martin Smith for more examples:
https://stackoverflow.com/a/7192951/2975396