superticker

Reputation: 51

Using Matrix and Vector types in Math.Net MultipleRegression

I've declared a MathNet Matrix and Vector type as follows ...

Matrix<double> X = Matrix<double>.Build.Dense(sampleSize,2);
Vector<double> yObserved = Vector<double>.Build.Dense(sampleSize);

but when I call ...

Vector<double> p = MultipleRegression.NormalEquations(X, yObserved, true);

Visual Studio gives the error

Error CS0411 The type arguments for method 'MultipleRegression.NormalEquations(T[][], T[], bool)' cannot be inferred from the usage. Try specifying the type arguments explicitly.

So how am I supposed to call the MultipleRegression class with Matrix and Vector arguments, if not like this? And why does Visual Studio find my type usage ambiguous?

I got my code to work fine with a jagged array for the matrix; now I want to get it running with the Matrix/Vector types instead.

Upvotes: 2

Views: 718

Answers (1)

Shelby115

Reputation: 2867

The overload of MultipleRegression.NormalEquations() that takes a Matrix and Vector parameter set only has 2 parameters.

By adding the boolean argument, you're steering overload resolution toward the T[][], T[], bool overload instead of the Matrix, Vector one, and type inference fails because Matrix<double> isn't a T[][].

I'm not sure what the intercept parameter does, so you'll have to look into whether you need it. Either convert your arguments to T[][] and T[], or call the method without the boolean (see below).

var p = MultipleRegression.NormalEquations(X, yObserved);

OR

var p = MultipleRegression.NormalEquations<double>(X, yObserved);
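If you do need the intercept with the Matrix/Vector overload, a common workaround is to prepend a column of ones to the design matrix yourself, so the first fitted coefficient plays the role of the intercept. A minimal sketch (the sampleSize value and the data-filling step are placeholders, not from the question):

```csharp
using MathNet.Numerics.LinearAlgebra;
using MathNet.Numerics.LinearRegression;

int sampleSize = 100; // assumed for illustration

Matrix<double> X = Matrix<double>.Build.Dense(sampleSize, 2);
Vector<double> yObserved = Vector<double>.Build.Dense(sampleSize);
// ... fill X and yObserved with your data ...

// Prepend a column of ones; its coefficient acts as the intercept term.
Matrix<double> XWithIntercept =
    X.InsertColumn(0, Vector<double>.Build.Dense(sampleSize, 1.0));

// Matrix/Vector overload takes no boolean, so no ambiguity.
Vector<double> p = MultipleRegression.NormalEquations(XWithIntercept, yObserved);
```

With this, p[0] should correspond to the intercept and the remaining entries to the slopes, which is what the intercept: true flag does for you in the jagged-array overload.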

Upvotes: 1
