Yeynno

Reputation: 331

large sparse matrix, svd

I want to calculate an SVD, but I haven't found a good Java library for it. Right now I keep the data in a HashMap, because the matrix doesn't fit into memory: it is about 400,000 x 10,000 and most entries are 0. I tried MTJ, JBLAS, Jama and others, but most of them either don't support sparse matrices or are too slow. I need the calculation to finish in 2-3 minutes at most. Can somebody recommend something? I also read about irlba in R; is it possible to send my data from Java to R, do the calculation there, and send the result back to my Java program?
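For the Java-to-R round trip, one minimal sketch could be: dump the sparse matrix to a MatrixMarket file, run an R script (irlba plus the Matrix package) with `Rscript`, and read the factors back from disk. This assumes `Rscript` is on the PATH, the script is called `svd.R`, and the HashMap keys encode (row, col) positions; all of those names are placeholders, not a finished solution:

```java
import java.io.*;
import java.util.Map;

public class SparseToR {

    // Write the HashMap-backed sparse matrix in MatrixMarket coordinate format,
    // which R's Matrix package can load with readMM().
    static void writeMatrixMarket(Map<Long, Double> entries, int rows, int cols, File out)
            throws IOException {
        try (PrintWriter pw = new PrintWriter(new BufferedWriter(new FileWriter(out)))) {
            pw.println("%%MatrixMarket matrix coordinate real general");
            pw.println(rows + " " + cols + " " + entries.size());
            for (Map.Entry<Long, Double> e : entries.entrySet()) {
                long key = e.getKey();            // assumes key = row * cols + col (0-based)
                long row = key / cols + 1;        // MatrixMarket indices are 1-based
                long col = key % cols + 1;
                pw.println(row + " " + col + " " + e.getValue());
            }
        }
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        // ... build the sparse matrix, then writeMatrixMarket(...) to "matrix.mtx" ...

        // Run an R script that calls irlba() on the matrix and writes U, S, V to disk.
        Process p = new ProcessBuilder("Rscript", "svd.R", "matrix.mtx", "50")
                .inheritIO()
                .start();
        int exit = p.waitFor();
        if (exit != 0) {
            throw new IOException("Rscript failed with exit code " + exit);
        }
        // ... read the factor files produced by svd.R back into Java ...
    }
}
```

Whether this fits the 2-3 minute budget depends mostly on the I/O and on how many singular vectors you request from irlba.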

Upvotes: 0

Views: 773

Answers (1)

tgogos

Reputation: 25152

I faced a similar problem while trying to apply Non-negative Matrix Factorization (NNMF) and Probabilistic Latent Semantic Analysis to large sparse term-document matrices. I tried Jblas [1] and Jama [2], but in the end I used Matlab. Because I was writing the whole app in Java, I ended up calling Matlab from Java through the MATLAB Compiler Runtime (MCR).

What to do:
Matlab has a feature called Matlab Builder JA. Use this tool to produce a .jar file that contains your Matlab code and can then be called from your Java program, as sketched below. This .jar needs the MCR to work.
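A hedged sketch of what the Java side might look like, assuming your Matlab function is called `sparseSvd` and Builder JA generated a wrapper class `SvdTool` in a package `svdtool` (these names, and the input arguments, are placeholders; the exact generated API depends on your project settings):

```java
import com.mathworks.toolbox.javabuilder.MWException;
import com.mathworks.toolbox.javabuilder.MWNumericArray;

import svdtool.SvdTool;   // hypothetical class produced by Matlab Builder JA

public class SvdCaller {
    public static void main(String[] args) throws MWException {
        SvdTool tool = null;
        try {
            tool = new SvdTool();
            // Generated wrappers take the number of requested outputs first,
            // followed by the Matlab function's input arguments.
            Object[] result = tool.sparseSvd(3, "matrix.mtx", 50);
            MWNumericArray u = (MWNumericArray) result[0];
            MWNumericArray s = (MWNumericArray) result[1];
            MWNumericArray v = (MWNumericArray) result[2];
            // ... convert the MWNumericArray factors to Java arrays as needed ...
        } finally {
            if (tool != null) {
                tool.dispose();   // release the MCR resources held by the wrapper
            }
        }
    }
}
```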

[1] Jblas was much faster than Jama, but I had problems and couldn't get it to run with a 64-bit JDK.
[2] Jama uses double[][], and this caused memory problems.
The above may have changed; I am referring back to July 2012 and don't have a clear picture of the current state.

Upvotes: 2
