Cevizli

Reputation: 153

Variable acting strange (like floating point)

I have this code:

static int test = 100;
static int Test
{
    get
    {
        return (int)(test * 0.01f);
    }
}

The output is: 0

But this code returns a different result:

static int test = 100;
static int Test
{
    get
    {
        var y = (test * 0.01f);
        return (int)y;
    }
}

The output is: 1

I also have this code:

static int test = 100;
static int Test
{
    get
    {
        return (int)(100 * 0.01f);
    }
}

The output is: 1

I looked at the IL output, and I don't understand why C# does this mathematical operation at compile time and produces a different result.

What is the difference between these two pieces of code? Why does the result change when I use a variable?

Upvotes: 6

Views: 178

Answers (1)

Patrick Hofman

Reputation: 157116

Because the compiler tricks you. The compiler is smart enough to do some of the math in advance so it doesn't have to do it at run time, which would be pointless. The expression 100 * 0.01f is calculated by the compiler, without the float precision loss that trips you up at run time.
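To make the run-time side visible, here is a small sketch. It uses double to expose the higher-precision intermediate value; the exact run-time behavior depends on the JIT, so treat this as an illustration, not a guarantee:

```csharp
using System;

class PrecisionDemo
{
    static void Main()
    {
        // 0.01f cannot be represented exactly in binary floating point;
        // the value actually stored is slightly below 0.01.
        float f = 0.01f;
        Console.WriteLine((double)f);    // ~0.00999999977648258

        // If the intermediate product is kept at higher precision
        // (which the JIT is allowed to do), 100 * f falls just short
        // of 1, and truncating it to int gives 0 -- matching the
        // first example in the question.
        double product = 100 * (double)f;
        Console.WriteLine(product);      // ~0.999999977648258
        Console.WriteLine((int)product); // 0
    }
}
```

Assigning the product to a float variable first (as in the second example) forces it to be rounded back to single precision, which is why that version can yield 1 instead.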

To prove this, try making the static test a const. You will see that the compiler is then able to do the math for you at compile time too. It has nothing to do with writing to a variable first, as in your sample; it is run-time vs. compile-time.
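A minimal sketch of that suggestion (the class name is hypothetical):

```csharp
// With `const`, both operands of the multiplication are compile-time
// constants, so the compiler folds (int)(100 * 0.01f) itself and no
// floating-point math is left to run at run time.
static class Demo
{
    const int test = 100; // const instead of static int

    public static int Test
    {
        get
        {
            return (int)(test * 0.01f); // now evaluated by the compiler
        }
    }
}
```

This matches the third example in the question, where the literal 100 also made the whole expression a compile-time constant and the output became 1.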

Upvotes: 2
