Long Nguyen

Reputation: 11

TensorFlow 2.0: Using GradientTape to do manual updates of parameters

I am trying to train a simple one-layer neural network using GradientTape from TensorFlow 2.0, updating all the parameters manually, but it doesn't seem to work.

Here's one iteration of the training loop:

import tensorflow as tf

W = tf.Variable(tf.random.normal([784,10], dtype = tf.float64, stddev=1))
b = tf.Variable(tf.random.normal([10], dtype = tf.float64))

X = x_train[0:mini_batch_size]
Y = y_train[0:mini_batch_size]

with tf.GradientTape() as tape:
    Y_pred = tf.sigmoid(tf.matmul(X,W)+b)
    loss = tf.reduce_mean(tf.reduce_sum((Y-Y_pred)**2, axis = 1))
dW, db = tape.gradient(loss, [W,b])

If I print out dW, it's all zeros. And the manual update W = W - 1.0 * dW raises the error: unsupported operand type(s) for *: 'float' and 'NoneType'.
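For reference, here is a minimal self-contained sketch of the same loop (not from the original post): it uses synthetic random data in place of x_train / y_train, and performs the manual update with assign_sub, which modifies W and b in place so they stay tf.Variable objects instead of being rebound to plain Tensors by an assignment like W = W - 1.0 * dW.

```python
import tensorflow as tf

tf.random.set_seed(0)

# Same parameter shapes as in the question.
W = tf.Variable(tf.random.normal([784, 10], dtype=tf.float64, stddev=1))
b = tf.Variable(tf.random.normal([10], dtype=tf.float64))

# Synthetic stand-ins for x_train / y_train (assumption: MNIST-like shapes).
mini_batch_size = 32
X = tf.random.uniform([mini_batch_size, 784], dtype=tf.float64)
labels = tf.random.uniform([mini_batch_size], maxval=10, dtype=tf.int32)
Y = tf.one_hot(labels, 10, dtype=tf.float64)

with tf.GradientTape() as tape:
    Y_pred = tf.sigmoid(tf.matmul(X, W) + b)
    loss = tf.reduce_mean(tf.reduce_sum((Y - Y_pred) ** 2, axis=1))

# Gradients with respect to the watched variables.
dW, db = tape.gradient(loss, [W, b])

# In-place update: W and b remain tf.Variable objects, so a later
# GradientTape will still track them automatically.
W.assign_sub(1.0 * dW)
b.assign_sub(1.0 * db)
```

Because assign_sub mutates the variable rather than creating a new Tensor, the next GradientTape in the loop still watches W and b and tape.gradient does not return None for them.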

Upvotes: 1

Views: 313

Answers (0)
