Reputation: 1651
I want to implement a classifier using the sklearn library. Is there a way to save the model, or convert it into a TensorFlow SavedModel, so that I can convert it to TensorFlow Lite later?
Upvotes: 14
Views: 10963
Reputation: 11
It is not always easy to replicate a scikit-learn model in TensorFlow. For instance, scikit-learn has a lot of on-the-fly imputation utilities that would be tricky to implement in TensorFlow.
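For example, even a plain mean imputer has no direct TensorFlow counterpart; here is a minimal sketch of what porting one by hand could look like, assuming missing values are encoded as NaN (the variable names are only illustrative):
import numpy as np
import tensorflow as tf
from sklearn.impute import SimpleImputer
# fit a mean imputer on data with missing values
x = np.array([[1.0, np.nan], [3.0, 4.0], [np.nan, 6.0]])
imputer = SimpleImputer(strategy="mean").fit(x)
# the learned column means have to be carried over to TensorFlow by hand
means = tf.constant(imputer.statistics_, dtype=tf.float32)
# re-implement the imputation as a TensorFlow op
x_tf = tf.constant(x, dtype=tf.float32)
x_imputed = tf.where(tf.math.is_nan(x_tf), tf.broadcast_to(means, tf.shape(x_tf)), x_tf)
# check that it matches scikit-learn's transform
assert np.allclose(x_imputed.numpy(), imputer.transform(x))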
Upvotes: 1
Reputation: 11240
If you replicate the architecture in TensorFlow, which will be pretty easy given that scikit-learn models are usually rather simple, you can explicitly assign the parameters from the learned scikit-learn models to TensorFlow layers.
Here is an example with logistic regression turned into a single dense layer:
import tensorflow as tf
import numpy as np
from sklearn.linear_model import LogisticRegression
# some random data to train and test on
x = np.random.normal(size=(60, 21))
y = np.random.uniform(size=(60,)) > 0.5
# fit the sklearn model on the data
sklearn_model = LogisticRegression().fit(x, y)
# create a TF model with the same architecture
tf_model = tf.keras.models.Sequential()
tf_model.add(tf.keras.Input(shape=(21,)))
tf_model.add(tf.keras.layers.Dense(1))
# assign the parameters from sklearn to the TF model
tf_model.layers[0].weights[0].assign(sklearn_model.coef_.transpose())
tf_model.layers[0].bias.assign(sklearn_model.intercept_)
# verify the two models make the same predictions (the Dense layer has no
# activation, so a positive logit means a predicted probability above 0.5)
assert np.all((tf_model(x) > 0)[:, 0].numpy() == sklearn_model.predict(x))
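Once the parameters live in a Keras model, the second half of the question (saving a TensorFlow model and converting it to TensorFlow Lite) follows the standard TensorFlow workflow. A minimal sketch continuing from the tf_model above, with arbitrary file names:
# export the replicated model in TensorFlow's SavedModel format
tf.saved_model.save(tf_model, "sklearn_as_tf_savedmodel")
# convert the in-memory Keras model directly to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_keras_model(tf_model)
tflite_model = converter.convert()
# write the .tflite flatbuffer to disk for deployment
with open("sklearn_model.tflite", "wb") as f:
    f.write(tflite_model)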
Upvotes: 6