narutoArea51

Reputation: 141

How to use Adam().minimize in TensorFlow 2.x?

First, I disable eager execution. Then I compute my loss function as follows:

def loss_fn(x, y):
    # Forward pass, then mean squared error between predictions and targets
    y_ = model(x, training=True)
    loss = tf.reduce_mean(tf.square(y_ - y))
    return loss

My optimizer is:

from tensorflow.keras.optimizers import Adam

opt = Adam(1e-3)

Now, I want to minimize the above loss. I wrote the following code:

def train(x, y):
    loss = loss_fn(x, y)
    opt.minimize(loss, var_list=model.trainable_variables)

but I get the following error:

 TypeError: 'Tensor' object is not callable

I decided to try the following:

def train(x, y):
    loss = loss_fn(x, y)
    opt.minimize(lambda: loss, var_list=model.trainable_variables)

But then I get the following error:

ValueError: No gradients provided for any variable: ['dense/kernel:0', 'dense/bias:0', ...]

I looked at some related posts but didn't find what I needed, for example: Tensorflow 2: How can I use AdamOptimizer.minimize() for updating weights

Can someone help me?

Upvotes: 0

Views: 769

Answers (1)

Vishwas Chepuri

Reputation: 731

The minimize function expects a callable that returns the loss, not a loss tensor, because it re-runs that callable under a gradient tape internally in order to compute gradients. That is why your first attempt raises TypeError: 'Tensor' object is not callable. Your second attempt fails because the loss is computed before the lambda is ever invoked, so nothing is recorded on the tape and no gradients can be derived. Compute the loss inside the callable instead:

def train(x, y):
    # The lambda defers the forward pass so minimize can trace it on its tape
    opt.minimize(lambda: loss_fn(x, y), var_list=model.trainable_variables)
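
For reference, here is a minimal end-to-end sketch of the fix. The single-layer model, the synthetic data, and the training loop below are illustrative assumptions, not part of the original question:

import numpy as np
import tensorflow as tf
from tensorflow.keras.optimizers import Adam

# Assumed toy setup: one dense layer fitting y = 2x + 1
# (runs with eager execution enabled, the TF2 default)
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
opt = Adam(1e-3)

def loss_fn(x, y):
    y_ = model(x, training=True)
    return tf.reduce_mean(tf.square(y_ - y))

def train(x, y):
    # Pass a callable so minimize can run the forward pass on its own tape
    opt.minimize(lambda: loss_fn(x, y), var_list=model.trainable_variables)

x = np.random.rand(32, 1).astype(np.float32)
y = 2 * x + 1

for step in range(100):
    train(x, y)

The equivalent explicit form, which is essentially what minimize does internally, uses tf.GradientTape directly (same assumed model and loss_fn as above):

def train_step(x, y):
    # Record the forward pass, then differentiate the loss with respect
    # to the trainable variables and apply the update
    with tf.GradientTape() as tape:
        loss = loss_fn(x, y)
    grads = tape.gradient(loss, model.trainable_variables)
    opt.apply_gradients(zip(grads, model.trainable_variables))
    return loss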

Upvotes: 2
