Reputation: 16365
Consider the C# code below:
public class Program
{
    public static void Main(string[] args)
    {
        // Your code goes here
        Console.WriteLine(double.MinValue < double.MinValue + 1);
        Console.WriteLine(int.MinValue < int.MinValue + 1);
    }
}
The output will be:
False
True
I know that
Console.WriteLine(double.MinValue + 1);
Console.WriteLine(double.MinValue);
will print the same value, -1,79769313486232E+308.
My question is: why does this "unexpected" and curious behavior occur? I expected the first line to print True as well.
The Java program below, for example, prints the expected true:
public class MyClass {
    public static void main(String args[]) {
        System.out.println(Double.MIN_VALUE < Double.MIN_VALUE + 1);
    }
}
Why the different behavior?
Upvotes: 1
Views: 344
Reputation: 840
While both Java and C# implement the IEEE 754 specification for floating-point computations, the two languages' "min value" constants do not mean the same thing.
You can try what happens (or rather "won't happen") here: http://weitz.de/ieee/
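For comparison, here is a minimal C# sketch of the difference (the exact formatting of the printed values depends on the runtime; the Java constants are mentioned in the comments for reference only):

using System;

public class MinValueComparison
{
    public static void Main()
    {
        // C#: the most negative finite double, about -1.8E+308.
        // Its Java counterpart is -Double.MAX_VALUE.
        Console.WriteLine(double.MinValue);

        // C#: the smallest positive double, about 4.9E-324.
        // This is what Java calls Double.MIN_VALUE.
        Console.WriteLine(double.Epsilon);
    }
}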
Upvotes: 1
Reputation: 198033
The two min values are different things.
In Java, MIN_VALUE is the name for the smallest positive double value. In C#, MinValue is the name for the most negative finite double value; in Java, that value would be -Double.MAX_VALUE.
And that value is so massively negative that adding 1 to it gets lost in the rounding. double doesn't have enough precision to represent MinValue differently from MinValue + 1. It's, roughly, -1800000....00 with 307 zeroes, and double can only represent, roughly, seventeen decimal digits of precision. So adding one gets lost in the rounding.
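A short C# sketch of that rounding, assuming the usual IEEE 754 double (the comparison results are the point, not the exact printed output):

using System;

public class RoundingDemo
{
    public static void Main()
    {
        // Adding 1 to a value of magnitude ~1.8E+308 changes it by far less
        // than one unit in the last place (roughly 2E+292 at that magnitude),
        // so the sum rounds back to the very same double.
        Console.WriteLine(double.MinValue + 1 == double.MinValue);  // True

        // int arithmetic is exact, so adding 1 really does produce a
        // different value, which is why the second line in the question
        // prints True.
        Console.WriteLine(int.MinValue + 1 == int.MinValue);        // False
    }
}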
Upvotes: 11