user_dsp

Reputation: 67

Autocorrelation with linear indexing of 2D matrix

I have a matrix xM which is indexed linearly, and when I try to calculate the autocorrelation of every column of it, I get an error that the max lag must be an integer. Maybe I shouldn't use the ind2sub function? Please help, thanks in advance.

xM =  x( idx );
[i,j] = ind2sub(size(xM),idx);
xc(1:i,1:j)=xcorr(xM(1:i,1:j),xM(1:i,1:j));

Upvotes: 1

Views: 425

Answers (1)

Wooly Jumper

Reputation: 443

xcorr doesn't take two matrix inputs. It probably thinks you are calling the function with the first xM as your signal (which may be a matrix) and the second xM as the MaxLag option, which would explain the "max lag must be integer" error. That interpretation would only succeed if the second input were actually a scalar; otherwise you get other errors, such as "When B is a vector, A must be a vector."
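One way around this, assuming you want the autocorrelation of each column on its own, is to call xcorr on one column at a time. This is a minimal sketch with made-up example data, not the asker's actual xM:

```
% Sketch: autocorrelation of each column of xM separately.
% xcorr(A,B) expects vectors, so loop over columns instead of
% passing the whole matrix twice.
xM = randn(100, 3);                 % example data; replace with your matrix
[nRows, nCols] = size(xM);
xc = zeros(2*nRows - 1, nCols);     % each autocorrelation has 2*nRows-1 lags
for k = 1:nCols
    xc(:, k) = xcorr(xM(:, k));     % autocorrelation of column k
end
```

Note that the single-input form xcorr(xM) also accepts a matrix and returns the auto- and cross-correlations of all column pairs in one call, which may be what you want instead of the loop.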

Upvotes: 1
