Reputation: 11
I want to export my HMM model because training it every time takes too long. My current approach is to save all of the matrices to a file. Is there a TensorFlow way to do this? Also, is it possible to export the model through an API to other languages such as C++?
Upvotes: 0
Views: 347
Reputation: 1076
tf.saved_model would be the recommended way to do this. Something like:
import tensorflow as tf
import tensorflow_probability as tfp

# Build an HMM whose parameters are tf.Variables, so they can be trained
# and tracked by the SavedModel machinery.
hmm = tfp.distributions.HiddenMarkovModel(
    initial_distribution=tfp.distributions.Categorical(
        logits=tf.Variable([0., 0])),
    transition_distribution=tfp.distributions.Categorical(
        logits=tf.Variable([[0., 0]] * 2)),
    observation_distribution=tfp.distributions.Normal(
        tf.Variable([0., 0]),
        tfp.util.TransformedVariable([1., 1], tfp.bijectors.Softplus(low=1e-3))),
    num_steps=10)

x = hmm.sample(100)

opt = tf.optimizers.Adam(0.01)

@tf.function
def one_step():
    with tf.GradientTape() as t:
        nll = -hmm.log_prob(x)
    grads = t.gradient(nll, hmm.trainable_variables)
    opt.apply_gradients(zip(grads, hmm.trainable_variables))

# A few training steps on the sampled data.
for _ in range(10):
    one_step()

# Wrap the distribution in a tf.Module so the exported SavedModel
# exposes a concrete log_prob signature.
class Foo(tf.Module):

    def __init__(self, hmm):
        super().__init__()  # initialize tf.Module bookkeeping
        self._hmm = hmm

    @tf.function(input_signature=[tf.TensorSpec.from_tensor(x)])
    def log_prob(self, x):
        return self._hmm.log_prob(x)

tf.saved_model.save(Foo(hmm), '/tmp/tf.model')

q = tf.saved_model.load('/tmp/tf.model')
q.log_prob(x)
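As a quick round-trip check (not part of the answer above, just a sketch assuming the code above has already run), the reloaded module should reproduce the trained model's log-probabilities:

import numpy as np

# The reloaded SavedModel should agree with the in-memory model.
np.testing.assert_allclose(q.log_prob(x).numpy(),
                           hmm.log_prob(x).numpy(),
                           rtol=1e-5)

Since the export is an ordinary SavedModel directory, it can also be loaded outside Python, for example via the TensorFlow C++ API's LoadSavedModel or served with TensorFlow Serving, which covers the cross-language part of the question.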
Upvotes: 0
Reputation: 4893
You can iterate over and save the weights from the model by reading the variables
attribute of tfp.distributions.HiddenMarkovModel()
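A minimal sketch of that approach, assuming the trained hmm object from the other answer and an illustrative file path: dump each variable's value with NumPy, then assign the stored values back into an identically constructed model.

import numpy as np

# Save: one array per variable, in the order they appear in hmm.variables.
np.savez('/tmp/hmm_weights.npz', *[v.numpy() for v in hmm.variables])

# Restore: rebuild the model with the same structure, then assign the
# stored values back (np.savez names positional arrays arr_0, arr_1, ...).
weights = np.load('/tmp/hmm_weights.npz')
for i, var in enumerate(hmm.variables):
    var.assign(weights[f'arr_{i}'])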
Upvotes: 1