Reputation: 1367
By default, the session saver checkpoints every variable that has been created, which results in very large checkpoint files. I want to save only the model parameters and certain session variables, e.g. the optimizer state and the global step. What are the best practices, besides white-listing variables during saver initialization?
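For reference, the white-listing approach mentioned above looks roughly like this. This is a minimal sketch using the `tf.compat.v1` namespace so it also runs under TF2; the variable names (`w`, `cache`, etc.) are illustrative, not from any real model:

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Model parameters worth persisting (illustrative names).
w = tf.Variable(tf.zeros([10, 10]), name="w")
b = tf.Variable(tf.zeros([10]), name="b")
global_step = tf.Variable(0, trainable=False, name="global_step")

# A large scratch buffer we do NOT want in the checkpoint.
cache = tf.Variable(tf.zeros([1000, 1000]), name="cache")

# White-list only the variables that should reach disk.
saver = tf.train.Saver(var_list=[w, b, global_step])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    path = saver.save(sess, os.path.join(tempfile.mkdtemp(), "model.ckpt"))

# The checkpoint contains w, b and global_step, but not cache.
saved = {name for name, _ in tf.train.list_variables(path)}
print(sorted(saved))
```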
Upvotes: 1
Views: 644
Reputation: 1367
After some investigation (checkpointing with different batch sizes and printing out all_variables()), I found that I was over-concerned. In fact, in TensorFlow the results of an Op are not saved, e.g. the y in y = k * x + b. Thus, unlike torch-nn, you rarely need to worry about non-parameters getting saved.
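A small check of this observation, in TF1-style graph mode via `tf.compat.v1`: only tf.Variable objects are tracked for checkpointing, while an Op result like y is an ordinary Tensor that is recomputed, never written to disk.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

k = tf.Variable(2.0, name="k")
b = tf.Variable(1.0, name="b")
x = tf.placeholder(tf.float32, name="x")
y = k * x + b  # an Op result (a Tensor), not a Variable

# Only k and b appear in the saveable collection.
print([v.op.name for v in tf.global_variables()])  # ['k', 'b']
```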
Upvotes: 1
Reputation: 57973
Saver by default gets its variable list from all_variables(), which returns all variables in the GraphKeys.VARIABLES collection. You can exclude a variable from that collection by using Variable(..., collections=[]). Or you could put it in another collection, as is done in the codebase for the non-checkpointed limit_epochs variable:
with ops.name_scope(name, "limit_epochs", [tensor]) as name:
  zero64 = constant_op.constant(0, dtype=dtypes.int64)
  epochs = variables.Variable(
      zero64, name="epochs", trainable=False,
      collections=[ops.GraphKeys.LOCAL_VARIABLES])
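A hedged sketch of both exclusion techniques (an empty collections list, and parking a variable in LOCAL_VARIABLES like the limit_epochs counter), written against `tf.compat.v1`; note that in later TF releases all_variables()/GraphKeys.VARIABLES were renamed global_variables()/GraphKeys.GLOBAL_VARIABLES:

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

w = tf.Variable(tf.zeros([3]), name="w")  # lands in the global collection

# In no collection at all: invisible to a default Saver.
scratch = tf.Variable(tf.zeros([3]), name="scratch", collections=[])

# In LOCAL_VARIABLES, like the limit_epochs counter.
epochs = tf.Variable(0, name="epochs", trainable=False,
                     collections=[tf.GraphKeys.LOCAL_VARIABLES])

saver = tf.train.Saver()  # defaults to the global-variables collection

with tf.Session() as sess:
    sess.run(tf.variables_initializer([w, scratch, epochs]))
    path = saver.save(sess, os.path.join(tempfile.mkdtemp(), "model.ckpt"))

# Only w made it into the checkpoint.
print({name for name, _ in tf.train.list_variables(path)})  # {'w'}
```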
Upvotes: 2