biswajit

Reputation: 3397

Why do I get different results when using a function versus a macro?

I'm using the Dev-C++ IDE, and while programming in C I found that the value returned by:

float f(float x)
{
      return 1/(1+x*x);
}

and the value returned by f(x) when it's defined as the macro:

#define f(x) 1/(1+x*x)

are different.

Why am I getting different results in these cases?

EDIT:

Here's my code for which I'm getting the anomaly:

#include <stdio.h>
#include <conio.h> // getch(); Dev-C++/Windows

#define f(x) 1/(1+x*x) // swap in the function version to compare

int main(void)
{
      int i;
      float a=0, b=1, h, n=12, s1, s2=0;

      h=(b-a)/n; //step length

      s1=f(a)+f(b);

      for(i=1;i<=n-1;i++)
      {
        s2+=f(a+(i*h));
      }

      s1=(s1+2*s2)*h/2; //trapezoidal rule

      printf("sum: %f", s1);
      getch();
      return 0;
}

OUTPUT 1: 0.693581 (using MACRO)

OUTPUT 2: 0.785109 (using function)

Upvotes: 1

Views: 209

Answers (3)

templatetypedef

Reputation: 373152

There are many possibilities here. Here are a few.

In the function version, the argument is explicitly typed as a float. This means that if you call

f(1);

then the argument 1 is converted to the float 1.0f. When 1 / (1 + x * x) is then computed, it evaluates to 1 / 2.0f, which comes out to 0.5f. In the macro version, however, f(1) expands to 1 / (1 + 1 * 1) and is evaluated as 1 / 2 using integer division, yielding the value 0.

Second, in the function version, the argument is evaluated only once. This means that

int x = 0;
f(x++);

will increment x to 1, then pass in the value 0. The result will then be 1.0f. In the macro version, however, the code expands to

1 / (1 + x++ * x++)

This has undefined behavior because there is no sequence point between the two evaluations of x++. The expression could evaluate to anything, or it could crash the program outright.

Finally, the function version respects operator precedence while the macro does not. For example, in the function version, calling

f(1 - 1)

will call f(0), evaluating to 1.0f. In the macro version, this expands to

  1 / (1 + 1 - 1 * 1 - 1)
= 1 / (1 + 1 - 1 - 1)
= 1 / 0

This causes undefined behavior, since integer division by zero is undefined in C.

The simple way to avoid this is to not use macros to define functions. Way back in the Bad Old Days this was a standard practice, but now that C has inline functions and compilers are way smarter, you should prefer functions to macros. They're safer, easier to use, and harder to mess up. They're also type aware and don't evaluate arguments multiple times.

Hope this helps!

Upvotes: 3

Fiddling Bits

Reputation: 8861

#define f(x) 1/(1+x*x)

should be defined as

#define f(x) 1/(1+(x)*(x))

Upvotes: 1

tckmn

Reputation: 59343

Try this:

#define f(x) (1/(1+x*x))

A #define performs textual substitution: the expansion is pasted wherever the name appears, so the usual operator precedence rules apply to the resulting expression, not to the macro "call" as a unit.

Upvotes: 0
