Is it possible to train a model on GPU, then predict on CPU?

I want to train my custom model on GPU devices. I am wondering whether clients will then be able to use it on a CPU.

Upvotes: 3

Views: 3077

Answers (1)

razimbres

Reputation: 5015

Yes. You do the heavy job of training on a GPU, save the weights, and then the CPU only has to do the matrix multiplications for predictions.
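As a minimal sketch of that idea in plain NumPy (the weights `W` and bias `b` here are hypothetical stand-ins for parameters learned elsewhere, e.g. on a GPU): once the weights are on disk, prediction is just a matrix multiply that any CPU can do.

```python
import numpy as np

# Hypothetical weights, standing in for parameters trained on a GPU
W = np.array([[0.5, -0.2], [0.1, 0.3]])
b = np.array([0.05, -0.05])
np.savez("/tmp/weights.npz", W=W, b=b)

# Later, on a CPU-only machine: load the weights and predict
params = np.load("/tmp/weights.npz")
x = np.array([[1.0, 2.0]])
prediction = x @ params["W"] + params["b"]  # a plain matrix multiply
print(prediction)  # [[0.75 0.35]]
```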

In TensorFlow and Keras you can train your model and save the neural network weights:

TensorFlow:

# ON GPU: train, then save the weights to a checkpoint
saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(init)
    # ... run your training ops here ...
    save_path = saver.save(sess, "/tmp/saved_model.ckpt")

# ON CPU: rebuild the same graph, then restore the weights
with tf.Session() as sess:
    saver.restore(sess, "/tmp/saved_model.ckpt")
    # ... run your prediction ops here ...

Keras:

model.save_weights('your_model_weights.h5')  # after training on the GPU
model.load_weights('your_model_weights.h5')  # on the CPU machine, after rebuilding the same architecture

With scikit-learn-style estimators (here an XGBClassifier from the xgboost package), you can persist the whole fitted model with joblib:

from xgboost import XGBClassifier
import joblib  # sklearn.externals.joblib is deprecated; import joblib directly

model = XGBClassifier(max_depth=100, learning_rate=0.7, n_estimators=10,
                      objective='binary:logistic', booster='gbtree',
                      n_jobs=16, eval_metric="error")
# eval_set belongs to fit(), not the constructor
clf = model.fit(x_train, y_train, eval_set=eval_set, verbose=True)
joblib.dump(clf, '/path/your_model.joblib')

model = joblib.load('/path/your_model.joblib')
model.predict(x_train)
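To check that the loaded model really predicts the same as the original, here is a small self-contained sketch. It substitutes scikit-learn's LogisticRegression for XGBClassifier (so it runs without xgboost installed) and a temp file for the placeholder path; the dump/load round-trip is the same.

```python
import os
import tempfile

import joblib
import numpy as np
from sklearn.linear_model import LogisticRegression

# Tiny toy dataset standing in for your real training data
x_train = np.array([[0.0], [1.0], [2.0], [3.0]])
y_train = np.array([0, 0, 1, 1])

# "GPU box": fit and persist the whole fitted model
clf = LogisticRegression().fit(x_train, y_train)
path = os.path.join(tempfile.mkdtemp(), "your_model.joblib")
joblib.dump(clf, path)

# "CPU box": load it back and predict
restored = joblib.load(path)
assert (restored.predict(x_train) == clf.predict(x_train)).all()
```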

Upvotes: 5
