Frank

Reputation: 51

Changed the data and got very weird output (simple linear regression)

This is a very simple example, something like a homework exercise. The code is below:

import numpy as np
import tensorflow as tf

x_data = np.array([0.35252703, 0.60587817, 0.8906856, 0.4813087, 0.53391305, 0.27751151])
y_data = x_data * 10

# linear model y = k * x + b, both parameters initialized to 0
b = tf.Variable(0.)
k = tf.Variable(0.)
y = k * x_data + b

# mean squared error, minimized with plain gradient descent
loss = tf.reduce_mean(tf.square(y_data - y))
optimizer = tf.train.GradientDescentOptimizer(0.2)
train = optimizer.minimize(loss)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    for step in range(400):
        sess.run(train)
        if step % 40 == 0:
            print(step, sess.run([k, b]))

The output is:
0 [1.2522789, 2.0945494]
40 [5.304193, 2.5357442]
80 [7.116992, 1.5568293]
120 [8.229965, 0.95582175]
160 [8.913281, 0.58682966]
200 [9.332804, 0.36028674]
240 [9.590374, 0.22119847]
280 [9.748508, 0.13580598]
320 [9.845596, 0.083378375]
360 [9.905204, 0.0511902]
This is pretty good. Then I changed the data this way:

x_data = np.array([352.52703, 605.87817, 0.8906856, 0.4813087, 0.53391305, 0.27751151])

Then the output is:
0 [327576.72, 640.39246]
40 [nan, nan]
80 [nan, nan]
120 [nan, nan]
160 [nan, nan]
200 [nan, nan]
240 [nan, nan]
280 [nan, nan]
320 [nan, nan]
360 [nan, nan]

Can anyone tell me why the second run produces this output?

Upvotes: 0

Views: 50

Answers (1)

youkaichao

Reputation: 2384

Just set the learning rate smaller.

I set the learning rate to 1e-5 and it works fine:

(0, [16.378834, 0.032019623])
(40, [9.9999628, 0.019538468])
(80, [9.9999628, 0.019527739])
(120, [9.9999628, 0.01951701])
(160, [9.9999628, 0.019506281])
(200, [9.9999628, 0.019495552])
(240, [9.9999628, 0.019484824])
(280, [9.9999628, 0.019474095])
(320, [9.9999628, 0.019463366])
(360, [9.9999628, 0.019452637])

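For reference, this is presumably what the modified script looks like: it is the code from the question with only the learning-rate argument changed from 0.2 to 1e-5 (a sketch, everything else assumed unchanged):

import numpy as np
import tensorflow as tf

# same data as the second run in the question
x_data = np.array([352.52703, 605.87817, 0.8906856, 0.4813087, 0.53391305, 0.27751151])
y_data = x_data * 10

b = tf.Variable(0.)
k = tf.Variable(0.)
y = k * x_data + b

loss = tf.reduce_mean(tf.square(y_data - y))
# the only change: a much smaller learning rate
optimizer = tf.train.GradientDescentOptimizer(1e-5)
train = optimizer.minimize(loss)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    for step in range(400):
        sess.run(train)
        if step % 40 == 0:
            print(step, sess.run([k, b]))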

You initialize k and b to 0, so at the first step the prediction is 0 and the error equals y_data itself. With the two large x values the gradient of the mean squared error is enormous (it scales with x², roughly -1.6e6 for k here), and with a learning rate of 0.2 the first update massively overshoots the minimum; every following step overshoots even further in the opposite direction, so the parameters grow without bound and quickly become NaN.
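A quick check of that first step in plain NumPy (my own back-of-the-envelope sketch, not part of the original answer) reproduces the 327576.72 and 640.39 printed at step 0 of the diverging run:

import numpy as np

x = np.array([352.52703, 605.87817, 0.8906856, 0.4813087, 0.53391305, 0.27751151])
y = x * 10
lr = 0.2

# gradients of mean((y - (k*x + b))**2) at k = 0, b = 0
grad_k = -2 * np.mean(x * y)   # about -1.64e6, dominated by the two large x values
grad_b = -2 * np.mean(y)       # about -3.2e3

# first gradient-descent update from (0, 0)
print(-lr * grad_k, -lr * grad_b)   # ~327576.7, ~640.4 -- matches step 0 above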

Upvotes: 1
