user20697471

Reputation: 1

How to get a Stochastic Gradient Descent result in a contour plot

import numpy as np
from matplotlib import pyplot as plt

xk = np.linspace(-1, 1, 100)
yk = 2 * xk + 3 + np.random.rand(len(xk))
x1, x2 = np.meshgrid(xk, yk)

F = (x1 - 2) ** 2 + 2 * (x2 - 3) ** 2
fig = plt.figure()
surf = fig.add_subplot(1, 1, 1, projection='3d')
surf.plot_surface(x1, x2, F)
surf.contour(x1, x2, F)

fig, surf = plt.subplots()
plt.contour(x1, x2, F, 20)

m = 0
c = 0
learning_rate = 0.01

I think the problem with my result starts here, but I can't find where it is.

for k in range(10):
    shuffle_index = np.random.permutation(len(xk))
    xk = xk[shuffle_index]
    yk = yk[shuffle_index]
    for i in range(len(xk)):
        grad_m = -2 * xk[i] * (yk[i] - (np.dot(m, xk[i]) + c))
        grad_c = -2 * (yk[i] - (np.dot(m, xk[i]) + c))
        m = m - learning_rate * grad_m
        c = c - learning_rate * grad_c
        surf.plot(np.array([xk[0], yk[0]]), np.array([xk[1], yk[1]]), 'ko-')
        if k != 10 or i != len(xk):
            surf.plot(np.array([xk[0], yk[0]]), np.array([xk[1], yk[1]]), 'ko-')
    plt.show()

This is my result for the above code (image not shown).

I wish to get a result like the one I get with the (batch) gradient descent algorithm. An example of my gradient descent result (image not shown).

May I know where my error is?
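For comparison, below is a minimal sketch (not the asker's exact intent, just an illustration) of how an SGD path is usually drawn on a contour plot. The key assumptions it makes: the contour is computed over the *parameter* space (a grid of candidate `m` and `c` values, with the mean-squared-error loss as height), not over the data `(x, y)`, and the plotted trajectory is the recorded sequence of `(m, c)` values after each update, not data points. The grid ranges and the `path` list are choices made for this sketch.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; drop this line when running locally
from matplotlib import pyplot as plt

np.random.seed(0)
xk = np.linspace(-1, 1, 100)
yk = 2 * xk + 3 + np.random.rand(len(xk))  # true slope ~2, intercept ~3.5

# Mean-squared-error loss evaluated on a grid of candidate (m, c) pairs
m_grid, c_grid = np.meshgrid(np.linspace(-1, 5, 100), np.linspace(-1, 6, 100))
preds = m_grid[..., None] * xk[None, None, :] + c_grid[..., None]
loss = np.mean((yk[None, None, :] - preds) ** 2, axis=-1)

m, c = 0.0, 0.0
learning_rate = 0.01
path = [(m, c)]  # record the parameter trajectory, one entry per update
for k in range(10):
    shuffle_index = np.random.permutation(len(xk))
    xs, ys = xk[shuffle_index], yk[shuffle_index]
    for xi, yi in zip(xs, ys):
        grad_m = -2 * xi * (yi - (m * xi + c))
        grad_c = -2 * (yi - (m * xi + c))
        m -= learning_rate * grad_m
        c -= learning_rate * grad_c
        path.append((m, c))

path = np.array(path)
fig, ax = plt.subplots()
ax.contour(m_grid, c_grid, loss, 20)
ax.plot(path[:, 0], path[:, 1], "ko-", markersize=2)  # SGD path in (m, c) space
ax.set_xlabel("m")
ax.set_ylabel("c")
fig.savefig("sgd_contour.png")
```

With this setup the trajectory starts at (0, 0) and wanders toward the contour minimum near (2, 3.5), which is what the gradient descent version of the plot typically shows.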

Upvotes: 0

Views: 123

Answers (0)

Related Questions