Reputation: 1
I'm trying to make a determinant calculator by using the cofactor method. Here is my code:
void det(int r, int c) {
    int check = 0;
    if (r == c) {
        check = 1;
    } else check = 0;
    if (check) {
        int matrix[r][c], co[r-1][c-1];
        int i, j, k, t;
        int result = 0;
        int det = 0;
        printf("Enter your SQUARE matrix:\n");
        for (i = 0; i < r; i++) {
            for (j = 0; j < c; j++) {
                scanf("%d", &matrix[i][j]);
            }
        }
        for (i = 0; i < r; i++) {
            int p = i;
            for (j = 0; j < c; j++) {
                int q = j;
                for (k = 0; k < r; k++) {
                    for (t = 0; t < c; t++) {
                        if (k != p && t != q) {
                            co[k-1][t-1] = matrix[i][j];
                        }
                    }
                }
                int det = (co[0][0] * co[1][1]) - (co[0][1] * co[1][0]);
                result += (matrix[0][j] * pow(-1, (i+1)+(j+1))) * det;
            }
        }
        printf("%d", result);
    }
}
I can't get the output right. Here's my example matrix:
3x3
1 2 4
-1 2 2
5 1 3
I want the output to be:
-14
Upvotes: 0
Views: 131
Reputation: 112374
I'm not going to attempt to understand your code, since it has more nested for loops than are needed for this problem, and since it is not recursive and appears to have the potential to work only on 3x3 matrices.
This needs to be recursive, or, equivalently, to keep a stack to track the work. For an n-by-n matrix where n > 2, you break the problem into n pieces: there are n cofactor matrices, one per column, each built from rows 1 through n-1 with that column removed. Recursively compute the determinant of each. Then multiply each of those determinants by the top-row value of the column that was omitted, with signs alternating 1, -1, 1, ... across the top row, and add those products up. That is your determinant. When n is 2, the determinant is the product of the diagonal elements minus the product of the off-diagonal elements. If you're given a 1x1 matrix, it's just the one element.
There are three nested for loops and a recursive call. Those are preceded by tests for n < 3, returning the appropriate value for those cases.
So something like this:
int det(int n, int m[][n]) {
    if (n < 1)
        return 0;
    if (n == 1)
        return m[0][0];
    if (n == 2)
        return m[0][0] * m[1][1] - m[0][1] * m[1][0];
    int d = 0, s = 1;
    int c[n - 1][n - 1];
    for (int k = 0; k < n; k++) {
        for (int i = 0; i < n - 1; i++)
            for (int j = 0; j < n - 1; j++)
                c[i][j] = m[i + 1][j < k ? j : j + 1];
        d += s * m[0][k] * det(n - 1, c);
        s = -s;
    }
    return d;
}
Or, a little more efficiently with only two levels of for loops, since only one column of the cofactor matrix needs to be replaced each time:
int det(int n, int m[][n]) {
    if (n < 1)
        return 0;
    if (n == 1)
        return m[0][0];
    if (n == 2)
        return m[0][0] * m[1][1] - m[0][1] * m[1][0];
    n--;
    int c[n][n];
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            c[i][j] = m[i + 1][j + 1];
    int d = 0, s = 1, k = 0;
    for (;;) {
        d += s * m[0][k] * det(n, c);
        if (k == n)
            return d;
        s = -s;
        for (int i = 0; i < n; i++)
            c[i][k] = m[i + 1][k];
        k++;
    }
}
Normally this would be done with doubles, not ints, for the matrix elements.
Note that this is the worst possible way to compute a determinant when n gets large, since it will take O(n!) time. Far, far better is to reduce or decompose the matrix, which will take O(n³) time.
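To make that concrete (this sketch is mine, not part of the answer): with Gaussian elimination, the determinant is the product of the pivots, with a sign flip for every row swap. A minimal version using doubles and partial pivoting might look like:

```c
#include <math.h>

/* Sketch of the O(n^3) approach: Gaussian elimination with partial
   pivoting. The determinant is the product of the pivots, negated once
   per row swap. Destroys its argument; not production code. */
double det_elim(int n, double m[][n]) {
    double d = 1.0;
    for (int k = 0; k < n; k++) {
        int p = k;                       /* row with largest pivot in column k */
        for (int i = k + 1; i < n; i++)
            if (fabs(m[i][k]) > fabs(m[p][k]))
                p = i;
        if (m[p][k] == 0.0)
            return 0.0;                  /* singular matrix */
        if (p != k) {                    /* row swap flips the sign */
            for (int j = 0; j < n; j++) {
                double t = m[k][j];
                m[k][j] = m[p][j];
                m[p][j] = t;
            }
            d = -d;
        }
        d *= m[k][k];
        for (int i = k + 1; i < n; i++) {  /* eliminate below the pivot */
            double f = m[i][k] / m[k][k];
            for (int j = k; j < n; j++)
                m[i][j] -= f * m[k][j];
        }
    }
    return d;
}
```

On the 3x3 example from the question this returns -14 (up to floating-point rounding).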
Upvotes: 2