Maz

Reputation: 91

Variable update issue in TensorFlow

My question is: in the simple code below, why does the value of some variables (w_3, for instance) not get updated, while the others do?

import tensorflow as tf
import numpy as np

x = [1, 2, 3]
x = np.array(x)

sess = tf.InteractiveSession()

input_data = tf.placeholder(dtype='float32', shape=(None))

w_1 = tf.Variable(tf.truncated_normal([1], stddev=0.01), trainable=True, name='w_1')
w_2 = tf.Variable(tf.truncated_normal([1], stddev=0.01), trainable=True, name='w_2')
w_3 = tf.Variable(tf.truncated_normal([1], stddev=0.01), trainable=True, name='w_3')

loss = tf.pow(w_1, 2) - input_data + tf.pow(w_2, 2) + tf.pow(w_1, 2)

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train_op = optimizer.minimize(loss)

init = tf.global_variables_initializer()
sess.run(init)

for j in range(0, 4):
    for i in range(0, 3):
        sess.run(train_op, feed_dict={input_data: x[i]})
        print('w1:', sess.run(w_1, feed_dict={input_data: x[i]}))
        print('w2:', sess.run(w_2, feed_dict={input_data: x[i]}))
        print('w3:', sess.run(w_3, feed_dict={input_data: x[i]}))

Upvotes: 1

Views: 49

Answers (1)

ted

Reputation: 14724

This is expected: your w_3 variable is not involved in your loss calculation, so the gradient of the loss with respect to w_3 is zero and gradient descent never updates it!
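You can verify this directly: tf.gradients returns None for any variable the loss does not depend on. A minimal check against the same graph as in your script (TF 1.x):

import tensorflow as tf

w_1 = tf.Variable(tf.truncated_normal([1], stddev=0.01), name='w_1')
w_2 = tf.Variable(tf.truncated_normal([1], stddev=0.01), name='w_2')
w_3 = tf.Variable(tf.truncated_normal([1], stddev=0.01), name='w_3')
input_data = tf.placeholder(dtype='float32', shape=(None))

# Same loss as in the question: w_3 never appears in it.
loss = tf.pow(w_1, 2) - input_data + tf.pow(w_2, 2) + tf.pow(w_1, 2)

# One gradient tensor per variable; the entry for w_3 is None,
# so the optimizer has nothing to apply to it.
print(tf.gradients(loss, [w_1, w_2, w_3]))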

Maybe you meant to use w_3 and made a simple yet typical typo: tf.pow(w_1, 2) appears twice in your loss, and the second occurrence was probably meant to be tf.pow(w_3, 2).
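If so, here is a self-contained sketch of the fix (assuming the duplicated w_1 term was indeed meant to be w_3, which only you can confirm):

import tensorflow as tf
import numpy as np

x = np.array([1, 2, 3], dtype=np.float32)

input_data = tf.placeholder(dtype='float32', shape=(None))
w_1 = tf.Variable(tf.truncated_normal([1], stddev=0.01), name='w_1')
w_2 = tf.Variable(tf.truncated_normal([1], stddev=0.01), name='w_2')
w_3 = tf.Variable(tf.truncated_normal([1], stddev=0.01), name='w_3')

# Hypothetical corrected loss: the duplicated w_1 term replaced by w_3,
# so the gradient now depends on all three variables.
loss = tf.pow(w_1, 2) - input_data + tf.pow(w_2, 2) + tf.pow(w_3, 2)
train_op = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(3):
        sess.run(train_op, feed_dict={input_data: x[i]})
        print('w3:', sess.run(w_3))  # now changes on every step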

Upvotes: 1
