I. A

Reputation: 2312

Using different optimizers to train the same layer in TensorFlow

I have a model which consists of convolutional layers followed by fully connected layers. I trained this model on the FER dataset. It is a classification problem where the number of outputs is equal to 8.

After training this model, I kept the fully connected layers and replaced only the last layer with a new one that has 3 outputs. The purpose was to fine-tune the fully connected layers while training the new output layer.

So I used one optimizer at the beginning to train the whole model, and then created a new optimizer to fine-tune the fully connected layers while training the new last layer.
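Here is a minimal, self-contained sketch of the two phases (the layer names, sizes, and placeholders are made up, and the convolutional part is omitted):

import tensorflow as tf

# Sketch of the two training phases (TF 1.x graph mode).
inputs = tf.placeholder(tf.float32, [None, 256])
labels_8 = tf.placeholder(tf.int64, [None])
labels_3 = tf.placeholder(tf.int64, [None])

# Fully connected part (the convolutional layers are omitted here).
with tf.variable_scope('Dense'):
    hidden = tf.layers.dense(inputs, 128, activation=tf.nn.relu)

logits_8 = tf.layers.dense(hidden, 8, name='head_8')  # original 8-class output layer
logits_3 = tf.layers.dense(hidden, 3, name='head_3')  # new 3-class output layer

loss_8 = tf.losses.sparse_softmax_cross_entropy(labels_8, logits_8)
loss_3 = tf.losses.sparse_softmax_cross_entropy(labels_3, logits_3)

# First optimizer: trains the whole model on the 8-class task.
train_full = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(loss_8)

# Second optimizer: meant to fine-tune the dense layer together with the new output layer.
train_finetune = tf.train.AdamOptimizer(learning_rate=1e-4).minimize(loss_3)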

As a result, I got the following error:

ValueError: Variable Dense/dense/bias/Adam/ already exists,

I know the reason for this error. The second optimizer was trying to create the variables it uses for updating the weights under the same names, because variables with those names had already been created by the first optimizer.

Hence, I would like to know how to fix this problem. Is there a way to delete the kernels associated with the first optimizer?

Any help is much appreciated!!

Upvotes: 2

Views: 412

Answers (1)

xdurch0

Reputation: 10474

This is probably caused by both optimizers using the (same) default name 'Adam'. To avoid this clash, you can give the second optimizer a different name, e.g.

opt_finetune = tf.train.AdamOptimizer(name='Adam_finetune')
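For instance, building on the sketch in the question (restricting var_list to the fully connected and new output variables is an extra, assumed step so that only those weights are updated during fine-tuning):

# Collect only the variables of the shared dense layer and the new 3-class head.
finetune_vars = (tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='Dense') +
                 tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='head_3'))

# The explicit name makes the optimizer's variables live under e.g.
# 'Dense/dense/bias/Adam_finetune' instead of colliding with the
# 'Dense/dense/bias/Adam' variables created by the first optimizer.
opt_finetune = tf.train.AdamOptimizer(learning_rate=1e-4, name='Adam_finetune')
train_finetune = opt_finetune.minimize(loss_3, var_list=finetune_vars)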

This should make opt_finetune create its variables under different names. Please let us know whether this works!

Upvotes: 3
