Reputation: 3974
Suppose there's a model defined as a class like this:
import tensorflow as tf

class SimpleAutoencoder(object):
    def __init__(self, x):
        self.x = x
        self.input_dim = 92
        self.latent_dim = 10
        self.build_model()

    def build_model(self):
        latent = tf.contrib.layers.fully_connected(self.x,
                                                   self.latent_dim,
                                                   scope='latent',
                                                   activation_fn=tf.nn.relu)
        self.x_hat = tf.contrib.layers.fully_connected(latent,
                                                       self.input_dim,
                                                       scope='output',
                                                       activation_fn=tf.nn.sigmoid)
        self.loss = tf.losses.mean_squared_error(self.x, self.x_hat)
        self.train_op = tf.train.AdamOptimizer().minimize(self.loss)
You train it using an input pipeline for feeding your data:
...
x = iterator.get_next()
model = SimpleAutoencoder(x)
...
## train and save it to disk
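For reference, a minimal sketch of how such a pipeline could look with tf.data (the train_data array, batch size, step count and checkpoint path are illustrative placeholders, not part of the original question):

import numpy as np
import tensorflow as tf

train_data = np.random.rand(1000, 92).astype(np.float32)   # hypothetical stand-in data
dataset = tf.data.Dataset.from_tensor_slices(train_data)
dataset = dataset.shuffle(1000).batch(32).repeat()
iterator = dataset.make_one_shot_iterator()
x = iterator.get_next()
model = SimpleAutoencoder(x)

saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(1000):
        _, loss = sess.run([model.train_op, model.loss])
    saver.save(sess, './simple_autoencoder')     # assumed checkpoint path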
Now, when using a placeholder for self.x while building the model, I can give it a name and easily access the input variable when I restore the model for inference. But with the input pipeline, x is not a variable, constant, or placeholder, so I cannot give it a proper name. How can I inject new data into x and feed it through the graph?
Even though training works, I suspect I might be doing something wrong, because the code looks really ugly to me (the part where the pipeline output is passed to the init function).
Please help me with this one! Thank you!
Upvotes: 0
Views: 50
Reputation: 24641
You can retrieve x's name using x.name, or you can rename x to a name of your taste using x = tf.identity(x, name='my_name').
(With either of these two solutions, you can feed your values via the name of the tensor, even though x is not a placeholder:
sess.run(my_ops, feed_dict={tensor_name: tensor_value})
)
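For instance, the tf.identity variant might look roughly like this at training and at inference time (the names 'input_x' and 'x_hat', the checkpoint path, and new_batch are illustrative assumptions, not from the answer; iterator is the pipeline iterator from the question):

# At training time: give the pipeline output and the reconstruction stable names.
x = tf.identity(iterator.get_next(), name='input_x')
model = SimpleAutoencoder(x)
x_hat = tf.identity(model.x_hat, name='x_hat')
# ... train and save with tf.train.Saver as usual ...

# At inference time: restore the graph and feed new data by tensor name,
# even though 'input_x:0' is not a placeholder.
with tf.Session() as sess:
    saver = tf.train.import_meta_graph('./simple_autoencoder.meta')   # assumed path
    saver.restore(sess, './simple_autoencoder')
    reconstruction = sess.run('x_hat:0', feed_dict={'input_x:0': new_batch})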
Or, you could replace the entire input pipeline with a placeholder (explained here for the opposite problem, i.e. replacing a placeholder with a Dataset input).
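A rough sketch of that placeholder variant, assuming the checkpoint path './simple_autoencoder' and a new batch new_batch of shape [batch_size, 92]; since the variable scopes ('latent', 'output') are unchanged, the Saver restores the trained weights into the rebuilt graph:

import tensorflow as tf

# Rebuild the same graph, but on a placeholder instead of the pipeline output.
x_ph = tf.placeholder(tf.float32, shape=[None, 92], name='x_ph')
model = SimpleAutoencoder(x_ph)

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, './simple_autoencoder')   # assumed checkpoint path
    reconstruction = sess.run(model.x_hat, feed_dict={x_ph: new_batch})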
Upvotes: 1