Ariel

Reputation: 211

what does var_list in tf.train.Saver mean?

I checked the TensorFlow API: "If None, var_list defaults to the list of all saveable objects." I want to know what "all saveable objects" means. Is it tf.global_variables?

    W = tf.get_variable("W", shape=[784, 256], initializer=tf.contrib.layers.xavier_initializer())
    b = tf.get_variable("b", shape=[784, 256], initializer=tf.contrib.layers.xavier_initializer())
    m = tf.add(W, b)

Does m belong to tf.global_variables? I am really confused.

[screenshot of my TensorBoard graph]

Here is my graph. I see that the train and loss nodes exist, so would those be saved as well? I just want to save the model weights.
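For reference, a minimal sketch (TF 1.x) of the same code with a print of tf.global_variables(), to check what a default Saver would see:

    import tensorflow as tf

    W = tf.get_variable("W", shape=[784, 256],
                        initializer=tf.contrib.layers.xavier_initializer())
    b = tf.get_variable("b", shape=[784, 256],
                        initializer=tf.contrib.layers.xavier_initializer())
    m = tf.add(W, b)  # an op output (tf.Tensor), not a tf.Variable

    print(tf.global_variables())  # [W, b] -- only the tf.Variable objects
    print(type(m))                # a Tensor, so it is not in tf.global_variables()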

Upvotes: 0

Views: 592

Answers (1)

Imtinan Azhar

Reputation: 1753

The var_list argument does what its name suggests: sometimes you do not want to save the entire model but only part of it, and this list lets you pick exactly which variables get written to the checkpoint. Let me elaborate.
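A minimal sketch (TF 1.x, with made-up variable names) of the difference between the default and an explicit var_list:

    import tensorflow as tf

    W = tf.get_variable("W", shape=[784, 256])
    b = tf.get_variable("b", shape=[256])

    # Default: var_list=None, so the Saver picks up all saveable objects
    # (for a plain graph like this, that is tf.global_variables(), i.e. W and b).
    saver_all = tf.train.Saver()

    # Explicit subset: only W ends up in the checkpoint.
    saver_w_only = tf.train.Saver(var_list=[W])

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver_w_only.save(sess, "./ckpt/w_only")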

I work on face recognition and trained a CNN that extracts information from face images and produces a 512-dimensional array of encodings, which I then pass to an SVM (or a simple ANN) that maps those embeddings to a name. I need the SVM at training time but not during inference, and saving its weights would make the model larger and consume more GPU memory. So when saving I can decide to store only the CNN variables and not the SVM/ANN ones: I pass the CNN's variables in var_list and leave the SVM's out, as in the sketch below.
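A sketch of how that selection could look in TF 1.x; the scope names "cnn" and "classifier" are hypothetical stand-ins for the CNN and the SVM/ANN head:

    import tensorflow as tf

    with tf.variable_scope("cnn"):
        conv_w = tf.get_variable("conv_w", shape=[3, 3, 3, 64])
    with tf.variable_scope("classifier"):
        fc_w = tf.get_variable("fc_w", shape=[512, 10])

    # Collect only the variables created under the "cnn" scope.
    cnn_vars = tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="cnn")

    # Only the CNN weights are written; the classifier head is left out.
    saver = tf.train.Saver(var_list=cnn_vars)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, "./ckpt/cnn_only")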

Another aspect of saving for inference is that layers like dropout are useless at inference time, so it's best not to store anything related to them at all, as they only take up memory.

In your case I would suggest not using var_list and simply saving the model as it is.
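Something along these lines (TF 1.x sketch, paths are placeholders); note that a checkpoint stores variable values (your weights), not graph ops like train or loss:

    import tensorflow as tf

    saver = tf.train.Saver()  # var_list=None -> all saveable objects

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # ... run your training ops here ...
        saver.save(sess, "./model/my_model")   # writes W, b, ... to a checkpoint
        # later: saver.restore(sess, "./model/my_model")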

Upvotes: 1
