Reputation: 5148
I'm new to MATLAB and machine learning, and I tried to compute the cost function for gradient descent.
The function computeCost takes 3 arguments: the matrix of input examples X, the vector of outputs y, and the parameter vector theta.
I already have a working solution using matrix multiplication:
function J = computeCost(X, y, theta)
    m = length(Y);
    h = X * theta;
    sError = (h - y) .^ 2;
    J = sum(sError) / (2 * m);
end
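As an aside, I understand the same cost can also be written with an inner product instead of the elementwise square and sum (a sketch that should be equivalent to the version above):

sError2 = (h - y)' * (h - y);   % inner product = sum of squared errors
J = sError2 / (2 * m);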
But now I tried to do the same without matrix multiplication:
function J = computeCost(X, y, theta)
    m = length(Y);
    S = 0;
    for i = 1:m
        h = X(i, 1) + theta(2) * X(i, 2);
        S = S + ((h - y(i)) ^ 2);
    end
    J = (1/2*m) * S;
end
But I didn't get the same result, even though the first version is correct for sure (I have already used it before).
Upvotes: 2
Views: 2054
Reputation: 104484
You have two slight (but fundamental) errors. They are pretty simple mistakes that can certainly be overlooked.
You forgot to include the bias term in your hypothesis:
h = X(i,1)*theta(1) + X(i,2)*theta(2);
%// ^^^^^^
Remember, the hypothesis for linear regression is theta^{T}*x. You didn't include all of the theta terms, only the second one.
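As a side note, you can avoid hard-coding each term by taking the inner product of the i-th row of X with theta; this is just a sketch of the same hypothesis, assuming (as in your code) that the bias column of ones is stored in the first column of X:

h = X(i, :) * theta;   % inner product: equals theta(1)*X(i,1) + theta(2)*X(i,2)

This also generalizes to any number of features without changing the line.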
Your last statement, which normalizes by (2*m), is slightly off. You currently have it as:

J = (1/2*m) * S;

Because multiplication and division share the same precedence and are evaluated left to right, (1/2*m) is the same as (1/2)*m, i.e. m/2, and that's not what you want. Just make sure that the (2*m) has brackets surrounding it so the expression is evaluated as S / (2*m):
J = (1/(2*m)) * S;
This will ensure that 2*m is evaluated first; its reciprocal is then multiplied by the sum of squared errors.
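You can verify this precedence behavior directly in the MATLAB command window, since / and * are applied left to right:

1/2*4      % evaluates as (1/2)*4, which gives 2
1/(2*4)    % evaluates as 1/8, which gives 0.125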
When you fix those problems, you will get the same results as using the matrix formulation.
BTW, there's a slight typo in your code: it should be m = length(y), not m = length(Y), since MATLAB variable names are case-sensitive.
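For reference, here is the loop version with all three fixes applied:

function J = computeCost(X, y, theta)
    m = length(y);                                  % lowercase y, not Y
    S = 0;
    for i = 1:m
        h = X(i, 1)*theta(1) + X(i, 2)*theta(2);    % bias term included
        S = S + ((h - y(i)) ^ 2);
    end
    J = (1/(2*m)) * S;                              % divide by 2*m
end

You can sanity-check that it matches the matrix version on some toy data (the values below are made up just for this check, and assume a bias column of ones in X):

X = [1 1; 1 2; 1 3];
y = [2; 3; 4];
theta = [0.5; 0.5];
computeCost(X, y, theta)   % both versions return roughly 1.2083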
Upvotes: 2