HappyRave

Reputation: 175

Sparse matrix-matrix multiplication

I'm currently working with sparse matrices, and I have to compare the computation time of sparse matrix-matrix multiplication with full matrix-matrix multiplication. The issue is that my sparse matrix computation is much slower than the full matrix computation.

I'm storing my matrices in Compressed Row Storage (CRS), and multiplying two matrices is very time-consuming (a quadruple for loop), so I'm wondering if there is a compression format better suited to matrix-matrix operations (CRS is very handy for matrix-vector computation).
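To make the setup concrete, here is a minimal sketch of the storage I mean; the array names rowPtr/colIdx/vals are just illustrative, not from any particular library:

```cpp
#include <cstdio>
#include <vector>

// Minimal sketch of Compressed Row Storage (CRS/CSR) for the 4x4 matrix
//     [ 10  0  0  2 ]
//     [  0  3  0  0 ]
//     [  0  0  0  0 ]
//     [  5  0  7  0 ]
struct CsrMatrix {
    int rows, cols;
    std::vector<int>    rowPtr; // size rows+1: row i occupies [rowPtr[i], rowPtr[i+1])
    std::vector<int>    colIdx; // column index of each stored nonzero
    std::vector<double> vals;   // value of each stored nonzero
};

int main() {
    CsrMatrix A{4, 4,
                {0, 2, 3, 3, 5},             // rowPtr
                {0, 3, 1, 0, 2},             // colIdx
                {10.0, 2.0, 3.0, 5.0, 7.0}}; // vals

    // Walk the nonzeros row by row to show how the arrays are traversed.
    for (int i = 0; i < A.rows; ++i)
        for (int p = A.rowPtr[i]; p < A.rowPtr[i + 1]; ++p)
            std::printf("A(%d,%d) = %g\n", i, A.colIdx[p], A.vals[p]);
    return 0;
}
```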

Thanks in advance!

Upvotes: 4

Views: 4310

Answers (2)

user1101165

Reputation: 59

You might look at the paper "Efficient Sparse Matrix-Matrix Products Using Colorings" (http://www.mcs.anl.gov/papers/P5007-0813_1.pdf) for a discussion of how to achieve high performance with sparse matrix-matrix products.

Upvotes: 2

Adam

Reputation: 17339

It's usually referred to as "Compressed Sparse Rows" (CSR), not CRS. The transpose, Compressed Sparse Columns (CSC), is also commonly used, including by the CSparse package, which ends up being the backend of quite a lot of systems, including MATLAB and SciPy (I think).

There is also a less common Doubly Compressed Sparse Columns (DCSC) format used by the Combinatorial BLAS. It compresses the column pointers again, keeping entries only for non-empty columns, and is useful when the matrix is hypersparse. A hypersparse matrix has most columns empty, something that happens with 2D matrix decomposition.
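As a rough sketch of the doubly-compressed idea (keep pointers only for the columns that actually contain nonzeros); this is a simplification for illustration, not the exact Combinatorial BLAS layout:

```cpp
#include <vector>

// Rough sketch of "double compression": instead of a colPtr array of
// length cols+1 (mostly padding when most columns are empty), keep
// pointers only for the non-empty columns.
// Simplified for illustration; not the exact CombBLAS data structure.
struct DcscSketch {
    int rows, cols;
    std::vector<int>    nzCols; // indices of non-empty columns, ascending
    std::vector<int>    colPtr; // size nzCols.size()+1: nonzeros of column
                                // nzCols[j] occupy [colPtr[j], colPtr[j+1])
    std::vector<int>    rowIdx; // row index of each stored nonzero
    std::vector<double> vals;   // value of each stored nonzero
};

int main() {
    // Hypersparse example: a 1,000,000 x 1,000,000 matrix whose only
    // nonzeros sit in columns 7 and 42 needs just two entries in nzCols,
    // instead of a colPtr array with a million-and-one entries.
    DcscSketch A{1000000, 1000000,
                 {7, 42},          // the two non-empty columns
                 {0, 2, 3},        // column 7 holds 2 nonzeros, column 42 holds 1
                 {0, 999999, 5},   // row indices of the nonzeros
                 {1.0, 2.0, 3.0}}; // their values
    (void)A; // sketch only; nothing to compute here
    return 0;
}
```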

That said, yes, there is more overhead. However, your operations are now dominated by the number of nonzeros, not the dimensions. So your FLOPS rate might be lower, but you still get your answer sooner.
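To illustrate the point, here is a minimal sketch of a row-by-row CSR x CSR product (often called Gustavson's algorithm) that avoids the quadruple loop; the struct and array names are illustrative, not from any particular library:

```cpp
#include <cstdio>
#include <vector>

// Row-by-row CSR x CSR product (Gustavson-style). Each row of the result
// is built in a dense accumulator, so the inner work is proportional to
//   sum over nonzeros A(i,k) of nnz(row k of B),
// i.e. it depends on nonzero counts, not on the matrix dimensions.
struct Csr {
    int rows, cols;
    std::vector<int>    rowPtr; // size rows+1: row i is [rowPtr[i], rowPtr[i+1])
    std::vector<int>    colIdx; // column index of each stored nonzero
    std::vector<double> vals;   // value of each stored nonzero
};

Csr multiply(const Csr& A, const Csr& B) {
    Csr C{A.rows, B.cols, {0}, {}, {}};
    std::vector<double> acc(B.cols, 0.0); // dense accumulator for one row of C
    std::vector<char>   used(B.cols, 0);  // which accumulator slots were touched
    std::vector<int>    touched;          // touched columns, for a cheap reset

    for (int i = 0; i < A.rows; ++i) {
        touched.clear();
        for (int pa = A.rowPtr[i]; pa < A.rowPtr[i + 1]; ++pa) {
            const int    k = A.colIdx[pa];
            const double a = A.vals[pa];
            for (int pb = B.rowPtr[k]; pb < B.rowPtr[k + 1]; ++pb) {
                const int j = B.colIdx[pb];
                if (!used[j]) { used[j] = 1; touched.push_back(j); }
                acc[j] += a * B.vals[pb];
            }
        }
        // Harvest row i of C (columns come out unsorted; sort if needed),
        // then reset only the touched accumulator slots.
        for (int j : touched) {
            C.colIdx.push_back(j);
            C.vals.push_back(acc[j]);
            acc[j] = 0.0;
            used[j] = 0;
        }
        C.rowPtr.push_back(static_cast<int>(C.colIdx.size()));
    }
    return C;
}

int main() {
    // A = [1 2; 0 3], B = [4 0; 0 5]  =>  A*B = [4 10; 0 15]
    Csr A{2, 2, {0, 2, 3}, {0, 1, 1}, {1.0, 2.0, 3.0}};
    Csr B{2, 2, {0, 1, 2}, {0, 1},    {4.0, 5.0}};
    Csr C = multiply(A, B);
    for (int i = 0; i < C.rows; ++i)
        for (int p = C.rowPtr[i]; p < C.rowPtr[i + 1]; ++p)
            std::printf("C(%d,%d) = %g\n", i, C.colIdx[p], C.vals[p]);
    return 0;
}
```

The dense accumulator costs O(cols) memory per row, but resetting only the touched slots keeps the per-row cost proportional to the nonzeros actually produced.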

Upvotes: 4
