Jacob Eggers

Reputation: 9322

Convert vector of indices into matrix

I want to convert a vector of indices into a matrix with ones in the columns given by the indices.

x = [2;1;3;1];
m = someFunc(x,3)
% m =
%
%   0   1   0
%   1   0   0
%   0   0   1
%   1   0   0

Upvotes: 6

Views: 1472

Answers (4)

jlh

Reputation: 4677

You can use accumarray, which makes this very easy:

accumarray([ (1:length(x))', x ], 1, [4, 3])

The (1:length(x))' part specifies which rows the ones go into, x specifies which columns, and [4, 3] is the size of the result.
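For reference, the same call generalizes if the hard-coded 4 is replaced with length(x) (a sketch using the question's example; n is just a local name for the number of columns):

x = [2;1;3;1];
n = 3;
m = accumarray([(1:length(x))', x], 1, [length(x), n])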

Upvotes: 0

Hans Then

Reputation: 11322

I tested the sub2ind function, but on the Coursera Machine Learning forum I was pointed to this beauty.

m = eye(num_cols)(x,:);

It indexes rows of the identity matrix with x, so each selected row has a one exactly in the column given by the corresponding value of x.
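Note that indexing directly into the result of eye is Octave syntax; plain MATLAB needs a temporary variable. A minimal sketch with the question's values (I is just a local name):

num_cols = 3;
x = [2;1;3;1];
I = eye(num_cols);   % identity matrix: row k has a one in column k
m = I(x, :)          % pick row x(i) of the identity for each element of x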

Upvotes: 15

Charity Leschinski

Reputation: 2906

I had a very similar question, so I didn't want to open a new one. I wanted to convert a row vector of indices into a matrix with ones in the rows (instead of the columns) given by the indices. I could have used the previous answer and transposed the result, but I thought this would perform better with very large matrices.

octave> x = [2 1 3 1];
octave> m = setRowsToOne(x, 3)
m =

   0   1   0   1
   1   0   0   0
   0   0   1   0

I couldn't see how to use sub2ind to accomplish this, so I calculated it myself.

function matrixResult = setRowsToOne(indexOfRows, minimumNumberOfRows)
   assert(all(indexOfRows > 0), 'Indices must be positive.');
   numRows = max([indexOfRows minimumNumberOfRows]);
   numCols = columns(indexOfRows);              % Octave: number of columns of the row vector
   matrixResult = zeros(numRows, numCols);
   % Linear index of row r in column c is (c-1)*numRows + r.
   matrixResult((0:numCols-1) * numRows + indexOfRows) = 1;
end

x = [2 1 3 1];
m = setRowsToOne(x, 3)
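For what it's worth, the row-wise variant can also be written by taking columns of an identity matrix, along the lines of the answer above; this is only a sketch and makes no claim about performance on very large matrices:

x = [2 1 3 1];
numRows = max([x 3]);   % at least the minimum number of rows
I = eye(numRows);
m = I(:, x)             % column j of m is column x(j) of the identity matrix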

Upvotes: 1

yuk

Reputation: 19870

One way is to use the SUB2IND function:

colN = 3;
assert(max(x) <= colN, 'Not enough columns') %# check that x fits into colN columns
%# other checks that x contains valid indices could go here

m = zeros(numel(x), colN);
m(sub2ind(size(m), 1:numel(x), x')) = 1;  %# x is a column vector, so x' is a row vector
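For completeness, a self-contained sketch with the values from the question:

x = [2;1;3;1];   %# column vector, as in the question
colN = 3;
m = zeros(numel(x), colN);
m(sub2ind(size(m), 1:numel(x), x')) = 1
%# m =
%#    0   1   0
%#    1   0   0
%#    0   0   1
%#    1   0   0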

Upvotes: 3
