Reputation: 3409
I have trained a TensorFlow LinearRegressor estimator as below:
import numpy as np
import tensorflow as tf

sample_size = train_x.shape[0]
feature_size = train_x.shape[1]
feature_columns = [tf.feature_column.numeric_column("x", shape=[feature_size])]
lr_estimator = tf.estimator.LinearRegressor(feature_columns=feature_columns)

# Convert the pandas dataframes to numpy arrays
train_x_mat = train_x.as_matrix()
test_x_mat = test_x.as_matrix()
train_y_mat = train_y.as_matrix()

# Define the training inputs
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": train_x_mat},
    y=train_y_mat,
    num_epochs=None,
    shuffle=True)

# Train model.
lr_estimator.train(input_fn=train_input_fn, steps=2000)
where train_x and train_y are pandas dataframes. The lr_estimator works, and I can call .predict on it successfully.
How can I save it to a file, and then load it back for prediction later? I am only building a small Python program, and the prediction program will run on the same desktop, so I don't need complicated server-based serving yet.
Upvotes: 2
Views: 650
Reputation: 21
def serving_input_receiver_fn():
    """Placeholder for the features fed in at serving time."""
    # Use shape [None, feature_size] so a whole batch of rows can be passed in.
    inputs = {"x": tf.placeholder(shape=[None, feature_size], dtype=tf.float32)}
    return tf.estimator.export.ServingInputReceiver(inputs, inputs)

# Export the trained model and its weights as a SavedModel
export_dir = lr_estimator.export_savedmodel(
    export_dir_base="/export_dir",
    serving_input_receiver_fn=serving_input_receiver_fn)

# Restore from disk and predict; the predictor manages its own session,
# so there is no need to call tf.saved_model.loader.load yourself.
from tensorflow.contrib import predictor
lr_predictor = predictor.from_saved_model(export_dir)
print(lr_predictor({"x": test_x_mat}))
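Since the question says the prediction program runs on the same desktop, Estimator checkpoints are an even simpler option than exporting a SavedModel. This is a sketch, not the answer's method: it assumes a TF 1.x Estimator and a hypothetical writable `./lr_model` directory. Passing `model_dir` makes the estimator write checkpoints during training, and a later program that rebuilds the estimator with the same `feature_columns` and `model_dir` restores the trained weights automatically before predicting.

```python
import tensorflow as tf

# Same feature columns as in the question; feature_size, train_input_fn and
# test_x_mat are assumed to be defined as in the question's code.
feature_columns = [tf.feature_column.numeric_column("x", shape=[feature_size])]

# Training run: checkpoints are written to model_dir automatically.
lr_estimator = tf.estimator.LinearRegressor(
    feature_columns=feature_columns, model_dir="./lr_model")
lr_estimator.train(input_fn=train_input_fn, steps=2000)

# Later, in a separate program: rebuilding the estimator with the same
# model_dir restores the latest checkpoint before predict() runs.
lr_estimator = tf.estimator.LinearRegressor(
    feature_columns=feature_columns, model_dir="./lr_model")
predict_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": test_x_mat}, num_epochs=1, shuffle=False)
predictions = list(lr_estimator.predict(input_fn=predict_input_fn))
```

The trade-off is that checkpoints require reconstructing the estimator in code, while a SavedModel is self-describing and can be loaded without the original model definition.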
Upvotes: 2