Reputation: 26919
I wrote this:
public static decimal Average(int a, int b)
{
    return (a + b) / 2;
}

public static void Main(string[] args)
{
    Console.WriteLine(Average(2, 1));
    Console.ReadKey();
}
but it returns 1, while it should return 1.5. How can I fix it to return 1.5?
Upvotes: 1
Views: 15341
Reputation: 11
The solutions above are wrong. You must check the following test cases when working with integer types:
TEST CASE
0 100
20 -21
21 20
2147483647 2147483647
-2147483647 2147483647
-2147483647 -2147483647
#include <stdio.h>
#include <limits.h>

/* Overflow-safe integer average: never adds two values of the same sign directly. */
int average(int a, int b) {
    if (a < 0 && b < 0) {
        /* both negative: step from a halfway towards b */
        b -= a;
        b /= 2;
        b += a;
    } else if (a < 0) {
        /* opposite signs: a + b cannot overflow */
        b += a;
        b /= 2;
    } else if (b < 0) {
        /* opposite signs: a + b cannot overflow */
        b += a;
        b /= 2;
    } else if (a > b) {
        /* both non-negative: step from b halfway towards a */
        a -= b;
        a /= 2;
        a += b;
        b = a;  /* the result was computed in a */
    } else {
        /* both non-negative, a <= b */
        b -= a;
        b /= 2;
        b += a;
    }
    return b;
}

int main() {
    int a = INT_MAX;
    int b = INT_MAX;
    scanf("%d %d", &a, &b);
    printf("Max INTEGER: %d\n", INT_MAX);
    printf("Average INTEGER: %d\n", average(a, b));
    return 0;
}
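The same pitfall exists in C#: casting to decimal only after the int addition still lets the sum wrap (with the default unchecked arithmetic). Here is a minimal C# sketch of the difference; the class and method names are just for illustration:

using System;

class AverageOverflowDemo
{
    // Casts only after the int addition, so the sum can still wrap around.
    static decimal CastAfterSum(int a, int b) => (decimal)(a + b) / 2;

    // Promotes to decimal before adding, so there is no intermediate overflow.
    static decimal CastBeforeSum(int a, int b) => ((decimal)a + b) / 2;

    static void Main()
    {
        int a = int.MaxValue, b = int.MaxValue;
        Console.WriteLine(CastAfterSum(a, b));   // -1 with default unchecked arithmetic
        Console.WriteLine(CastBeforeSum(a, b));  // 2147483647
    }
}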
Upvotes: 1
Reputation: 2770
It returns 1 because (a + b) / 2 is integer division: an int has no fractional part, so the result is truncated. E.g. if the mathematical result were 1.99, the returned value would be 1.
If you require a fractional result, you need to use a floating-point type or cast your integers before dividing. If you require higher precision, a double-precision floating-point type (double) can be used.
Something else to consider is that there are library methods that will do this for you, e.g. LINQ's Average(), which averages the elements of a collection.
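A short sketch of both suggestions, assuming the usual System and System.Linq namespaces:

using System;
using System.Linq;

class Program
{
    static void Main()
    {
        int a = 2, b = 1;

        // Cast before dividing so the division happens in double, not int.
        Console.WriteLine((double)(a + b) / 2);       // 1.5

        // LINQ's Average() on a sequence of ints returns a double.
        Console.WriteLine(new[] { a, b }.Average());  // 1.5
    }
}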
Edit: see this question for a more detailed answer specific to division:
What is the behavior of integer division?
Upvotes: 1
Reputation: 405
You are missing a cast in the Average function:
public static decimal Average(int a, int b)
{
    return (decimal)(a + b) / 2;
}

public static void Main(string[] args)
{
    Console.WriteLine(Average(2, 1));
}
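Equivalently (a small variation, not part of the answer above), a decimal literal divisor also forces decimal division without an explicit cast:

public static decimal Average(int a, int b)
{
    // 2m is a decimal literal, so the int sum is promoted to decimal before dividing.
    return (a + b) / 2m;
}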
Upvotes: 10