Reputation: 93
I am trying to implement Kernel Ridge Regression in R.
The formula is:
alpha <- ((lambda.I + K)^(-1)) * y
Lambda = 0.1, I is an identity matrix of the same size as K, and y is a vector with the same number of rows as K.
So I tried this in R:
I <- diag(nrow(df_matrix))
lambda <- 0.1
alpha <- (lambda * I + df_matrix) ^ (-1) * df_vector
I get the following error
Error in (0.1 * I + df_matrix)^(-1) * df_vector : non-conformable arrays
Here's some information on my dataset
> nrow(df_matrix)
[1] 8222
> ncol(df_matrix)
[1] 8222
> nrow(df_vector)
[1] 8222
> nrow(I)
[1] 8222
> ncol(I)
[1] 8222
> class(df_matrix)
[1] "matrix"
> class(df_vector)
[1] "matrix"
Upvotes: 4
Views: 23838
Reputation: 581
To invert a matrix, the matrix has to be square and its determinant has to be nonzero. If the matrix you are inverting here, lambda * I + df_matrix, fulfills these requirements, then
alpha <- solve(lambda * I + df_matrix) %*% df_vector
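If you want to check those requirements before calling solve(), a minimal sketch reusing the question's df_matrix and df_vector could look like the following (note: for a matrix this large, det() can under- or overflow, so the condition-number estimate kappa() is usually a more practical test, though it is not cheap either; the name A is just a temporary variable introduced here):

lambda <- 0.1
A <- lambda * diag(nrow(df_matrix)) + df_matrix   # the matrix that actually gets inverted

stopifnot(nrow(A) == ncol(A))   # must be square
if (kappa(A) > 1e12) {          # a huge condition number means numerically singular
  warning("lambda * I + df_matrix is ill-conditioned; solve() may fail or be inaccurate")
}

alpha <- solve(A) %*% df_vector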
Upvotes: 1
Reputation: 66874
You need to use matrix multiplication, %*%. In addition, you need to use solve() to compute the inverse, since raising a matrix to the power of minus one just takes element-wise reciprocals. For example:
lambda <- 0.1
K <- matrix(runif(9), 3)          # toy 3 x 3 kernel matrix
y <- matrix(runif(3), nrow = 3)   # toy response vector
solve(lambda * diag(nrow(K)) + K) %*% y
[,1]
[1,] 0.50035075
[2,] -0.04985508
[3,] 0.74944867
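Since runif() is not seeded here, your exact numbers will differ; what matters is that the result is a 3 x 1 column vector, one alpha per row of K.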
Upvotes: 2
Reputation: 2718
I bet you want matrix inversion here, which is solve(m), instead of the element-wise m^(-1). Also, matrix multiplication (%*%) instead of element-wise multiplication (*). So, altogether it is
alpha <- solve(lambda * I + df_matrix) %*% df_vector
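As a self-contained check of that one-liner, here is a small sketch with made-up data (the random K and y below are placeholders, not the asker's df_matrix and df_vector):

set.seed(1)
n <- 5
K <- tcrossprod(matrix(rnorm(n * 2), n))   # toy symmetric positive semi-definite "kernel" matrix
y <- matrix(rnorm(n), nrow = n)            # toy response vector

lambda <- 0.1
I <- diag(n)

alpha <- solve(lambda * I + K) %*% y       # matrix inverse and matrix product, not ^(-1) and *
dim(alpha)                                 # 5 x 1: one coefficient per training row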
Upvotes: 6