Reputation: 4386
I was wondering: how does SqlParameter know the type when it is not specified?
For example, when instantiating a parameter object you don't need to specify the DB Type. Somehow when the command is executed .net handles this for you. Behind the scenes is there some sort of conversion taking place? Is there a cost when you do / don't specify the type explicitly?
Upvotes: 6
Views: 3648
Reputation: 403
The type is inferred from the data type of the parameter value that is passed to the SqlParameter constructor.
You can find more info at this link: Configuring Parameters and Parameter Data Types.
Go to the "Specifying Parameter Data Types" section.
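As a sketch of the two styles, here is how inferred versus explicit typing looks (the table and column names, the length of 50, and the open `connection` object are assumptions for illustration):

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumes 'connection' is an open SqlConnection.
var cmd = new SqlCommand(
    "SELECT * FROM Users WHERE Name = @name", connection);

// Type not specified: ADO.NET infers it from the CLR type of the value.
// A .NET string maps to NVARCHAR, an int to INT, and so on.
cmd.Parameters.AddWithValue("@name", "Alice");

// Type specified explicitly: no inference takes place, and the
// declared SQL type and length are sent to the server as-is.
cmd.Parameters.Add("@name", SqlDbType.VarChar, 50).Value = "Alice";
```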
Upvotes: 0
Reputation: 294427
Is there a cost when you do / don't specify the type explicitly?
Yes, and the cost can be huge. It has nothing to do with a client-side cast, but everything to do with server-side execution. You should read Under the Table - How Data Access Code Affects Database Performance. Some problems that can occur are:
- When you use SqlCommand.Parameters.AddWithValue("@someparam", "somestring"), the resulting parameter is of NVARCHAR type. If your query has a clause WHERE somecolumn = @someparam and somecolumn is of type VARCHAR and indexed, then the type mismatch will prevent the index use.
- When you use SqlCommand.Parameters.AddWithValue("@someparam", "somestring"), it will result in a query that has a parameter of type NVARCHAR(10), and when you use SqlCommand.Parameters.AddWithValue("@someparam", "anotherstring"), it will result in a query that has a parameter of type NVARCHAR(13). These will be considered different queries and produce different plans. In time you can pollute the server plan cache with hundreds of plans, one for each possible length of the parameter. This gets exacerbated when multiple parameters are present: each combination of lengths will create its own plan.
- There are possible problems with other types too, but strings are the most notorious culprits.
Again, read the linked article; it is highly relevant.
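The fix described above can be sketched as follows (the column length of 100 and the `cmd` SqlCommand are assumptions; pick the length to match the actual VARCHAR column):

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumes 'cmd' is a SqlCommand whose text contains
// "WHERE somecolumn = @someparam" and somecolumn is VARCHAR(100).

// Risky: type and length are inferred from the value, so "somestring"
// becomes NVARCHAR(10) and "anotherstring" NVARCHAR(13), producing a
// separate cached plan per length and an NVARCHAR/VARCHAR mismatch.
cmd.Parameters.AddWithValue("@someparam", "somestring");

// Safer: declare the type and a fixed length matching the column, so
// the parameter is always VARCHAR(100) regardless of the value's
// length, the index stays usable, and one cached plan is reused.
cmd.Parameters.Add("@someparam", SqlDbType.VarChar, 100).Value = "somestring";
```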
Upvotes: 5