Reputation: 21
Implement the gradient descent algorithm in this question. Let $\{X_1,\dots,X_n\}$ be a dataset and $g(x) = n^{-1}\sum_{i=1}^{n}(x - X_i)^2$. It is known that the mean of the dataset is the solution to the minimization problem $\min_{x\in\mathbb{R}} g(x)$.
To minimize g(x), you are going to use a while loop to implement the gradient descent algorithm, as follows.
Step 0. Initialize $x_1 = 0$.
Step 1. In the $k$th step, where $k = 1, 2, \dots$, set $x_{k+1} = x_k - 0.99^k \, g'(x_k)$.
Step 2. Repeat Step 1 until $|g'(x_k)|$ is smaller than a small tolerance level tol (e.g., set it to 1e-5) or until $k$ exceeds the maximum number of iterations $K_{\max}$ (e.g., set it to 1000).
You are going to implement the gradient descent algorithm to find the mean. Use the dataset cars$speed for $\{X_1,\dots,X_n\}$. You don't have to write the algorithm as a function in this question; you will do that in the next one.
Could someone help me with this?
Here is what I have so far:
data(cars)
x1 <- 0
k <- 1
toleranceLevel <- 0.00005
X <- cars$speed
kmax <- 10000
while (x1 > toleranceLevel) {
  gxprime <- 2 * mean(x1 - X)
  gxprime
  x1 <- (((x1) - (.99^k)) * gxprime)
  if (x1 < toleranceLevel) {
    k <- k + 1
  } else {
  }
  if (k == kmax) {
    break
  }
  print(k)
}
Upvotes: 1
Views: 352
Reputation: 79298
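Two things go wrong in your attempt: the while condition tests x1 itself (x1 starts at 0, so the loop body never runs), and the update line multiplies (x1 - .99^k) by the gradient instead of subtracting .99^k * gxprime from x1. Here is a version that follows the update $x_{k+1} = x_k - 0.99^k\, g'(x_k)$, where $g'(x) = \frac{2}{n}\sum_{i=1}^{n}(x - X_i)$ is simply 2 * mean(x - X), and stops once successive iterates barely change: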
data(cars)
x_old <- 0                     # Step 0: initialize x_1 = 0
k <- 1
toleranceLevel <- 0.00005
X <- cars$speed
kmax <- 10000
err <- 1
while (err > toleranceLevel & k < kmax) {
  # Step 1: gradient descent update x_{k+1} = x_k - 0.99^k * g'(x_k),
  # with g'(x) = 2 * mean(x - X)
  x_new <- x_old - .99^k * 2 * mean(x_old - X)
  err <- abs(x_new - x_old)    # stop when successive iterates barely change
  x_old <- x_new
  k <- k + 1
}
x_new
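A quick way to check the result is to compare it against the closed-form minimizer of g, the sample mean:

mean(X)   # sample mean of cars$speed; x_new should be very close to this value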
Upvotes: 1