Reputation: 77
I am trying to convert an Oracle script to a SQL Server script, but MIN()
does not return the same result in SQL Server as it does in Oracle.
For SQL Server :
SELECT MIN(v)
FROM (VALUES ('20013E17587A1_2'), ('20013E17587_2')) AS value(v);
Result: 20013E17587_2
However,
For ORACLE :
SELECT MIN(t.value)
FROM tab t;
Result: 20013E17587A1_2
Can somebody explain why this difference occurs and what can be done to get the same result?
Upvotes: 3
Views: 184
Reputation: 65323
Because Oracle compares the ASCII value of each character, position by position, when sorting alphanumeric strings. This is called a binary sort, which is the default for Oracle Database.
ASCII('A') equals 65, and ASCII('_') equals 95. If the string were 20013E17587.2 instead of 20013E17587_2, you would get 20013E17587.2 as the result, since ASCII('.') equals 46, which is less than 65.
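The same ordering can be reproduced outside the database: Python compares strings code point by code point, exactly like Oracle's binary sort (a small illustration, not part of the original answer):

```python
# Python string comparison works code point by code point,
# mirroring Oracle's default binary sort.
values = ['20013E17587A1_2', '20013E17587_2']

# Both strings share the prefix '20013E17587'; the first differing
# position compares 'A' (code point 65) against '_' (code point 95).
print(ord('A'), ord('_'))   # 65 95

# Since 65 < 95, the string containing 'A' sorts first under binary
# ordering, which is why Oracle's MIN() returns it.
print(min(values))          # 20013E17587A1_2
```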
Upvotes: 4
Reputation: 28789
Different sort rules. You're asking the database whether it considers _ to come before or after A. By default, Oracle uses a binary sort (and code point 65, belonging to A, is less than code point 95, belonging to _), while SQL Server uses the default collation of the database, which will be a linguistic ordering where _ is considered to precede any letter. If you want SQL Server to exhibit identical behavior, use something like
SELECT MIN(v COLLATE Latin1_General_BIN2)
FROM (VALUES ('20013E17587A1_2'), ('20013E17587_2')) AS value(v);
The correct placement of the COLLATE clause depends on your "real" query, which I presume this isn't -- you may want to change the collation of the column itself in the CREATE TABLE, for example.
Upvotes: 4