Ayush Pandey

Reputation: 587

How to save a tensorflow model (omitting the labels tensor) with no variables defined

My tensorflow model is defined as follows:

X = tf.placeholder(tf.float32, [None, training_set.shape[1]], name='X')
Y = tf.placeholder(tf.float32, [None, training_labels.shape[1]], name='Y')
A1 = tf.contrib.layers.fully_connected(X, num_outputs=50, activation_fn=tf.nn.relu)
A1 = tf.nn.dropout(A1, 0.8)
A2 = tf.contrib.layers.fully_connected(A1, num_outputs=2, activation_fn=None)
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=A2, labels=Y))
global_step = tf.Variable(0, trainable=False)
start_learning_rate = 0.001
learning_rate = tf.train.exponential_decay(start_learning_rate, global_step, 200, 0.1, True)
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)

Now I want to save this model, omitting the tensor Y (Y is the label tensor used for training; X is the actual input). Also, when specifying the output node for freeze_graph.py, should I use "A2", or is it saved under some other name?

Upvotes: 2

Views: 595

Answers (1)

Maxim

Reputation: 53768

Although you haven't defined any variables manually, the code snippet above actually contains 15 saveable variables. You can list them using this internal TensorFlow function:

from tensorflow.python.ops.variables import _all_saveable_objects

# Everything a default tf.train.Saver() would save:
for obj in _all_saveable_objects():
  print(obj)

For the code above, it produces the following list:

<tf.Variable 'fully_connected/weights:0' shape=(100, 50) dtype=float32_ref>
<tf.Variable 'fully_connected/biases:0' shape=(50,) dtype=float32_ref>
<tf.Variable 'fully_connected_1/weights:0' shape=(50, 2) dtype=float32_ref>
<tf.Variable 'fully_connected_1/biases:0' shape=(2,) dtype=float32_ref>
<tf.Variable 'Variable:0' shape=() dtype=int32_ref>
<tf.Variable 'beta1_power:0' shape=() dtype=float32_ref>
<tf.Variable 'beta2_power:0' shape=() dtype=float32_ref>
<tf.Variable 'fully_connected/weights/Adam:0' shape=(100, 50) dtype=float32_ref>
<tf.Variable 'fully_connected/weights/Adam_1:0' shape=(100, 50) dtype=float32_ref>
<tf.Variable 'fully_connected/biases/Adam:0' shape=(50,) dtype=float32_ref>
<tf.Variable 'fully_connected/biases/Adam_1:0' shape=(50,) dtype=float32_ref>
<tf.Variable 'fully_connected_1/weights/Adam:0' shape=(50, 2) dtype=float32_ref>
<tf.Variable 'fully_connected_1/weights/Adam_1:0' shape=(50, 2) dtype=float32_ref>
<tf.Variable 'fully_connected_1/biases/Adam:0' shape=(2,) dtype=float32_ref>
<tf.Variable 'fully_connected_1/biases/Adam_1:0' shape=(2,) dtype=float32_ref>

There are variables from both fully_connected layers, and several more come from the Adam optimizer (see this question). Note that there are no X and Y placeholders in this list, so there's no need to exclude them. Of course, these tensors exist in the meta graph, but they don't hold any values, hence they are not saveable.
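
If you want to double-check that the placeholders are only graph nodes rather than saveable objects, you can list them separately. A minimal sketch using the TF 1.x graph API (the printed names depend on your graph):

graph = tf.get_default_graph()
# Placeholders are ordinary ops of type 'Placeholder'; they hold no value,
# which is why they never appear in _all_saveable_objects().
print([op.name for op in graph.get_operations() if op.type == 'Placeholder'])
# For the graph above, this prints ['X', 'Y']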

The _all_saveable_objects() list is what the TensorFlow saver saves by default when variables are not provided explicitly. Hence, the answer to your main question is simple:

saver = tf.train.Saver()  # all saveable objects!
with tf.Session() as sess:
  tf.global_variables_initializer().run()
  saver.save(sess, "...")
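
For completeness, restoring later works the same way. A minimal sketch, assuming the checkpoint was written to a path like "model.ckpt" (an illustrative path, since the one above is elided):

# Rebuild the graph from the exported .meta file and load the variable values.
saver = tf.train.import_meta_graph("model.ckpt.meta")  # "model.ckpt" is an illustrative path
with tf.Session() as sess:
  saver.restore(sess, "model.ckpt")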

There's no way to pass a name to the tf.contrib.layers.fully_connected function (as a result, the second layer is saved under fully_connected_1/...), but you're encouraged to switch to tf.layers.dense, which has a name argument (a short sketch follows below). To see why that's a good idea anyway, take a look at this and this discussion.
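
If you do switch, here's a minimal sketch of the last layer with an explicit name; the names 'logits' and 'output' are illustrative choices, not anything required by the API. The tf.identity wrapper gives you a stable node name you can pass to freeze_graph.py via --output_node_names:

# Hypothetical replacement for the second fully_connected layer above:
A2 = tf.layers.dense(A1, units=2, activation=None, name='logits')
# Wrap the inference output so it has an explicit, stable name for freeze_graph.py:
output = tf.identity(A2, name='output')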

Upvotes: 3
