I need help with a C program that finds the root intervals of a given function. I based the program on the fact that if f(a) * f(b) < 0, then there is at least one root in (a, b).
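For reference, this is the kind of sign-change scan I am going for, as a minimal sketch. The function g(x) = x*x - 5 is just a hypothetical stand-in that actually changes sign near its roots (about ±2.24); it is not my real f(x):

#include <stdio.h>

/* Hypothetical stand-in for illustration; x*x - 5 has roots near -2.24 and 2.24. */
double g(double x) { return x * x - 5; }

int main(void) {
    /* Scan [-50, 50) in unit steps; a strict sign change between x and x + 1
       means g has at least one root in (x, x + 1). An exact zero landing on a
       grid point would need a separate g(x) == 0 check. */
    for (double x = -50; x < 50; x += 1) {
        if (g(x) * g(x + 1) < 0) {
            printf("(%g, %g)\n", x, x + 1);
        }
    }
    return 0;
}

That sketch prints (-3, -2) and (2, 3), which is the style of output I want. My actual program is below: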
#include <stdio.h>
#include <math.h>

#define f(x) x*x + 2*x + 1

void RootInterval() {
    float a[10];                      /* left endpoints of the intervals  */
    float b[10];                      /* right endpoints of the intervals */
    float begin, end;
    begin = -50;
    end = -50;
    int count, i = 0;

    while (begin < 50) {
        if (f(begin) >= 0) {
            /* advance end until f(end) turns negative */
            while (f(end) >= 0) {
                end = end + 1;
            }
            a[i] = begin;
            b[i] = end;
            begin = end + 1;
            i++;
        } else {
            /* advance end until f(end) turns non-negative */
            while (f(end) < 0) {
                end = end + 1;
            }
            a[i] = begin;
            b[i] = end;
            begin = end + 1;
            i++;
        }
    }

    count = sizeof(a) / sizeof(a[0]);
    printf("Root interval is:\n");
    for (i = 0; i < count; i++) {
        printf("(%f, %f) ", a[i], b[i]);
    }
}

int main() {
    RootInterval();
    return 0;
}
But when I run the program, nothing appears on the screen. What am I doing wrong here? I have tried different ranges and different conditions; I was expecting it to print every root interval in the range (-50, 50).