Reputation: 7614
Double dblValue = 0.0001;
Boolean a = (dblValue >= (1 / 1000));
Boolean b = (dblValue >= 0.001);
Console.WriteLine("dblValue >= (1 / 1000) is " + a);
Console.WriteLine("dblValue >= 0.001 is " + b);
Console.ReadLine();
The above C# code evaluates 'a' to true and 'b' to false. In VB.NET, the equivalent code evaluates 'a' to false and 'b' to false. Why would 'a' evaluate to true?
Is there an implicit conversion I'm missing here - and why doesn't it affect VB.NET (Strict)?
Upvotes: 1
Views: 320
Reputation: 245499
1 and 1000 are both integers, so the result will be an integer (0 in this case). You need to force the use of doubles to complete the math.
Boolean a = (dblValue >= ((double) 1 / (double) 1000));
or
Boolean a = (dblValue >= (1d / 1000d));
Either will give you the result you're expecting.
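To illustrate the difference, here's a minimal sketch (reusing dblValue from the question) that prints both the raw division results and the comparisons:
double dblValue = 0.0001;

// Both operands are int, so 1 / 1000 is integer division and truncates to 0.
Console.WriteLine(1 / 1000);                      // 0
Console.WriteLine(dblValue >= 1 / 1000);          // True  (0.0001 >= 0)

// Casting either operand, or using the d suffix, forces double division.
Console.WriteLine((double) 1 / (double) 1000);    // 0.001
Console.WriteLine(dblValue >= (double) 1 / 1000); // False (0.0001 >= 0.001)
Console.WriteLine(dblValue >= 1d / 1000d);        // False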
Upvotes: 5
Reputation: 1503869
The expression 1 / 1000 is evaluated (at compile time in this case, although it's irrelevant really) using integer arithmetic in C#, so it evaluates to 0. Use 1.0 / 1000 [1] instead to force double arithmetic to be used.
I believe VB always uses floating point arithmetic for /, and you have to use \ if you want to perform division using integer arithmetic, which is why you're seeing different behaviour there.
[1] Or, as per comments, use 1d or (double) 1 or anything else that will force either of the operands to be considered to be of type double.
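As a quick check, any of those spellings promotes the division to double (again assuming dblValue from the question):
double dblValue = 0.0001;

// Each variant makes at least one operand a double, so the division yields 0.001 rather than 0.
Console.WriteLine(dblValue >= 1.0 / 1000);        // False
Console.WriteLine(dblValue >= 1d / 1000);         // False
Console.WriteLine(dblValue >= (double) 1 / 1000); // False

// The original all-integer division still compares against 0.
Console.WriteLine(dblValue >= 1 / 1000);          // True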
Upvotes: 14