xTMx

Reputation: 77

My program isn't displaying the right average; it only gives me zeros after the decimal point. (New to programming.)

Here is the code as written in Visual Studio:

    #include <stdio.h>

    void main()
    {
        int n, i, num, s;
        float av;

        printf("How Many numbers?");
        scanf("%d", &n);
        s = 0;

        for (i = 1; i <= n; i++) {
            printf("enter number #%d  : ", i);
            scanf("%d", &num);
            s = s + num;
        }

        av = s / n;
        printf("The Average is %f", av);
        getchar();
    }

I really don't know why it isn't displaying the right average :/

Upvotes: 0

Views: 488

Answers (2)

πάντα ῥεῖ

Reputation: 1

av=s/n; Look up "integer division". You probably want to use av=(float)s/n;

Dividing two integer values is done entirely in integer arithmetic, so the fractional part is discarded; the result isn't converted to a float value unless you cast one of the operands.
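For illustration, a minimal sketch of the difference (the values 7 and 2 are arbitrary; the variable names follow the question):

    #include <stdio.h>

    int main(void)
    {
        int s = 7, n = 2;
        float av;

        av = s / n;           /* integer division: 7 / 2 yields 3, stored as 3.000000 */
        printf("%f\n", av);   /* prints 3.000000 */

        av = (float)s / n;    /* the cast promotes the division to floating point */
        printf("%f\n", av);   /* prints 3.500000 */

        return 0;
    }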

Upvotes: 1

Ayushi Jha

Reputation: 4023

The problem is here: av=s/n; the division is performed in integer arithmetic, so the fractional part is lost before the result is stored in the float. A simple solution: use typecasting:

av=(float)s/n;

or

av=s/(float)n;

Another alternative: make either s or n a float.
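A minimal sketch of that alternative, with a hard-coded array standing in for the scanf input so the example is self-contained (the array and its values are made up for illustration):

    #include <stdio.h>

    int main(void)
    {
        /* s is a float, so s / n below is a floating-point division */
        float s = 0.0f;
        int num[] = {1, 2, 2, 2};
        int n = sizeof num / sizeof num[0];

        for (int i = 0; i < n; i++)
            s = s + num[i];

        printf("The Average is %f\n", s / n);   /* prints 1.750000 */
        return 0;
    }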

Upvotes: 2
