Reputation: 733
How can I fix layers of a neural network in tensorflow?
For example, in this sample program, suppose I already know the second layer's weights from training another neural network (call it B). Can I plug those in as a fixed layer and train only the first layer of the network below?
import tensorflow as tf
x = tf.placeholder(tf.float32, [None, 784])
#layer 1
W1 = tf.Variable(tf.zeros([784, 100]))
b1 = tf.Variable(tf.zeros([100]))
y1 = tf.matmul(x, W1) + b1 # no softmax on the hidden layer
#layer 2
W2 = tf.Variable(tf.zeros([100, 10]))
b2 = tf.Variable(tf.zeros([10]))
y2 = tf.nn.softmax(tf.matmul(y1, W2) + b2)
#output
y = y2
y_ = tf.placeholder(tf.float32, [None, 10])
Upvotes: 1
Views: 103
Reputation: 12908
You can mark specific variables as not trainable, and you can initialize variables from stored values. Something like:
W2 = tf.Variable(saved_weights, trainable=False)
b2 = tf.Variable(saved_biases, trainable=False)
y2 = tf.nn.softmax(tf.matmul(y1, W2) + b2)
where saved_weights and saved_biases contain your pre-learned weight matrix and bias vector, respectively. For reference, see the tf.Variable docs.
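Here is a minimal sketch of the whole setup, assuming saved_weights and saved_biases are NumPy arrays exported from network B (for example via sess.run on its layer-2 variables); random placeholder values stand in for them here. Because the optimizer only minimizes over variables in the trainable-variables collection, W2 and b2 stay fixed while W1 and b1 are learned:

import numpy as np
import tensorflow as tf

# Hypothetical pre-learned layer-2 parameters, e.g. obtained from network B
# with: saved_weights, saved_biases = sess.run([W2_B, b2_B])
saved_weights = np.random.randn(100, 10).astype(np.float32)
saved_biases = np.zeros(10, dtype=np.float32)

x = tf.placeholder(tf.float32, [None, 784])
y_ = tf.placeholder(tf.float32, [None, 10])

# Layer 1: trainable (this is what we want to learn)
W1 = tf.Variable(tf.truncated_normal([784, 100], stddev=0.1))
b1 = tf.Variable(tf.zeros([100]))
y1 = tf.matmul(x, W1) + b1

# Layer 2: frozen, initialized from the pre-learned values
W2 = tf.Variable(saved_weights, trainable=False)
b2 = tf.Variable(saved_biases, trainable=False)
logits = tf.matmul(y1, W2) + b2

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))

# minimize() only updates trainable variables, so W2 and b2 are untouched
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})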
Upvotes: 1