GoingMyWay

Reputation: 17468

How to fine-tune weights in specific layers in TensorFlow?

I'm trying to implement Progressive Neural Networks. In that paper, the authors apply transfer learning to exploit previously learned knowledge when training new reinforcement learning agents. Two questions:

  1. How can I lock certain layers so that the weights and biases of these layers can't be updated?
  2. How can I train only specific layers during training?

Here is my code:

import tensorflow as tf
import tensorflow.contrib.slim as slim

def __create_network(self):
    with tf.variable_scope('inputs'):
        # Use None (not -1) for the variable batch dimension of a placeholder
        self.inputs = tf.placeholder(shape=[None, 80, 80, 4], dtype=tf.float32, name='input_data')

    with tf.variable_scope('networks'):
        with tf.variable_scope('conv_1'):
            self.conv_1 = slim.conv2d(activation_fn=tf.nn.relu, inputs=self.inputs, num_outputs=32,
                                      kernel_size=[8, 8], stride=4, padding='SAME')

        with tf.variable_scope('conv_2'):
            self.conv_2 = slim.conv2d(activation_fn=tf.nn.relu, inputs=self.conv_1, num_outputs=64,
                                      kernel_size=[4, 4], stride=2, padding='SAME')

        with tf.variable_scope('conv_3'):
            self.conv_3 = slim.conv2d(activation_fn=tf.nn.relu, inputs=self.conv_2, num_outputs=64,
                                      kernel_size=[3, 3], stride=1, padding='SAME')

        with tf.variable_scope('fc'):
            self.fc = slim.fully_connected(slim.flatten(self.conv_3), 512, activation_fn=tf.nn.elu)

I want to lock conv_1, conv_2, and conv_3 and train only fc after restoring the checkpoint data.

Upvotes: 1

Views: 2072

Answers (1)

jkschin

Reputation: 5844

Locking certain variables is slightly complicated, and there are a few ways to do it. This post covers it and is quite similar to your question.
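One such way (a sketch of my own, not from the linked post): create the layers you want frozen with trainable=False, so their variables never enter the TRAINABLE_VARIABLES collection and any optimizer using the default var_list skips them. For your first conv layer, that would look like:

# Sketch: freeze a layer at creation time. slim.conv2d accepts a trainable
# argument; variables built with trainable=False are excluded from
# tf.GraphKeys.TRAINABLE_VARIABLES, so optimizers ignore them by default.
self.conv_1 = slim.conv2d(activation_fn=tf.nn.relu, inputs=self.inputs, num_outputs=32,
                          kernel_size=[8, 8], stride=4, padding='SAME', trainable=False)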

The easy way out would be to do the following:

# Collect only the variables under the 'networks/fc' scope. get_collection
# filters with re.match, which is anchored at the start of the variable
# name, so scope='fc' alone would not match 'networks/fc/...'.
fc_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='networks/fc')
train_op = opt.minimize(loss, var_list=fc_vars)
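Putting it together, here is a minimal restore-then-train sketch. The loss tensor, the optimizer choice, the checkpoint path, and the model/batch names are placeholder assumptions, not from your question:

# Minimal usage sketch; loss, the optimizer, and the checkpoint path are
# hypothetical placeholders.
fc_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='networks/fc')
opt = tf.train.RMSPropOptimizer(learning_rate=1e-4)
train_op = opt.minimize(loss, var_list=fc_vars)  # only fc variables are updated

saver = tf.train.Saver()  # by default restores all variables, frozen ones included
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.restore(sess, 'checkpoints/model.ckpt')  # hypothetical path
    # model is your network instance; batch is shaped [N, 80, 80, 4]
    sess.run(train_op, feed_dict={model.inputs: batch})

Note that gradients still flow through the frozen conv layers during backprop; they simply receive no updates because they are absent from var_list.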

Upvotes: 1
