Reputation: 29
I want to freeze all layers except the first three layers of the Inception v3 model in TensorFlow (Python 3), and also modify the weights of those three layers so that I can re-initialize and re-train only the first three layers of the network. If this can't be done with the Inception model, is there another network (in TensorFlow) with which this could be done?
Upvotes: 1
Views: 1456
Reputation: 53768
This can be done in any network. To freeze the lower layers during training, the simplest solution is to give the optimizer the list of variables to train, excluding the variables from the lower layers:
# Collect only the trainable variables whose scope matches the regex "hidden[34]|outputs"
train_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES,
                               scope="hidden[34]|outputs")
# The optimizer updates only these variables; all other layers stay frozen
training_op = optimizer.minimize(loss, var_list=train_vars)
The first line gets the list of all trainable variables in hidden layers 3 and 4 and in the output layer, leaving out the variables in hidden layers 1 and 2. The snippet above assumes the layers use the variable scopes hidden1, ..., hidden4 and outputs; the Inception model uses a different naming: Conv2d_2a_*, Conv2d_2b_*, AvgPool_1a_*, ...
Upvotes: 2