Reputation: 777
I understand that we can write custom models and encapsulate them using tf.estimator, but I just can't seem to find any documentation with an example.
I know that you have to define your model inside a model_fn, but what exactly should I return from this function? Also, am I supposed to put the loss and the training step within the model_fn, or just the network? How should I modify the code given below to make it work with tf.estimator? Would really appreciate some help.
import tensorflow as tf

stddev0 = 0.1  # initial stddev (defined elsewhere in my code)

def test_model(features, labels):
    # Input
    X = tf.placeholder(tf.float32, shape=(None, 1), name="Data_Input")
    # Output
    Y = tf.placeholder(tf.float32, shape=(None, 1), name="Target_Labels")
    W = tf.Variable(tf.random_normal([1], stddev=stddev0))
    b = tf.Variable(tf.random_normal([1], stddev=stddev0))
    Ypredict = W * X + b
    return Ypredict

estimator = tf.estimator.Estimator(model_fn=test_model)
Upvotes: 2
Views: 492
Reputation: 805
You should return a tf.estimator.EstimatorSpec object. Something to the effect of:
def model_fn(features, labels, mode, params):
    # ... your marvelous model, producing `logits` (and `labels_onehot`) ...

    loss = tf.losses.softmax_cross_entropy(onehot_labels=labels_onehot, logits=logits)
    optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001)
    train_op = optimizer.minimize(loss=loss, global_step=tf.train.get_global_step())

    return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op)
There's more to it, so for a better walkthrough, see here.
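To connect this back to the linear model in your question, here is a minimal sketch (assuming the TF 1.x Estimator API) of how the whole thing could look: the model is built from features rather than placeholders, and the TRAIN, EVAL and PREDICT modes are handled explicitly. The function name linear_model_fn, the feature key "x", the learning rate and the toy data are illustrative choices, not anything prescribed by the API.

import numpy as np
import tensorflow as tf

def linear_model_fn(features, labels, mode):
    # Build the linear model directly from `features`; no placeholders are
    # needed because the Estimator feeds data through the input_fn.
    x = features["x"]
    W = tf.get_variable("W", shape=[1],
                        initializer=tf.random_normal_initializer(stddev=0.1))
    b = tf.get_variable("b", shape=[1], initializer=tf.zeros_initializer())
    y_pred = W * x + b

    # PREDICT: only the predictions are needed.
    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode=mode, predictions={"y": y_pred})

    # TRAIN and EVAL both need a loss.
    loss = tf.losses.mean_squared_error(labels=labels, predictions=y_pred)

    if mode == tf.estimator.ModeKeys.TRAIN:
        optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
        train_op = optimizer.minimize(loss=loss,
                                      global_step=tf.train.get_global_step())
        return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op)

    # EVAL: return the loss (metrics could be added here as well).
    return tf.estimator.EstimatorSpec(mode=mode, loss=loss)

estimator = tf.estimator.Estimator(model_fn=linear_model_fn)

# Toy data: y = 2x + 1 with a little noise.
x_train = np.random.rand(100, 1).astype(np.float32)
y_train = 2.0 * x_train + 1.0 + 0.05 * np.random.randn(100, 1).astype(np.float32)

train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": x_train}, y=y_train, batch_size=16, num_epochs=None, shuffle=True)
estimator.train(input_fn=train_input_fn, steps=1000)

The key points are that the input_fn replaces your placeholders, and that model_fn returns a different EstimatorSpec depending on mode, so yes, the loss and training step belong inside model_fn.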
Upvotes: 1