Suzana Ilić

Reputation: 73

Save TensorFlow model for TensorFlow.js

I trained a chatbot in TensorFlow and would like to save the model in order to deploy it to the web with TensorFlow.js. I have the following:

import numpy as np
import tensorflow as tf

checkpoint = "./chatbot_weights.ckpt"
session = tf.InteractiveSession()
session.run(tf.global_variables_initializer())
saver = tf.train.Saver()
saver.restore(session, checkpoint)


# Converting the questions from strings to lists of encoding integers
def convert_string2int(question, word2int):
    question = clean_text(question)
    return [word2int.get(word, word2int['<OUT>']) for word in question.split()]

# Setting up the chat
while(True):
    question = input("You: ")
    if question == 'Goodbye':
        break
    question = convert_string2int(question, questionswords2int)
    question = question + [questionswords2int['<PAD>']] * (25 - len(question))
    fake_batch = np.zeros((batch_size, 25))
    fake_batch[0] = question
    predicted_answer = session.run(test_predictions, {inputs: fake_batch, keep_prob: 0.5})[0]  # note: keep_prob 1.0 would disable dropout at inference
    answer = ''
    for i in np.argmax(predicted_answer, 1):
        if answersints2word[i] == 'i':
            token = ' I'
        elif answersints2word[i] == '<EOS>':
            token = '.'
        elif answersints2word[i] == '<OUT>':
            token = 'out'
        else:
            token = ' ' + answersints2word[i]
        answer += token
        if token == '.':
            break
    print('ChatBot: ' + answer)

and it gives the following files (and I can test the bot in the console):

[screenshot: saved checkpoint files]

But the documentation says I should use SavedModel or a frozen model; can anyone help here? I'm not sure how to implement it. Thanks. https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md

Upvotes: 3

Views: 1178

Answers (1)

edkeveked

Reputation: 18401

To deploy your model in the browser, you first need to convert it using tfjs-converter. You can have a look at the tfjs-converter tutorial to see how to proceed.

For the model to be converted successfully, all the ops used in your model must already be supported in the browser. The tfjs-converter repository has the full list of ops that are currently supported.

Once your model is converted and you have the model topology and weight files, you can load it using loadFrozenModel:

import {loadFrozenModel} from '@tensorflow/tfjs-converter';

// loadFrozenModel returns a promise, so await it inside an async function
const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);
...
model.execute({input: the_input_of_the_model});

Upvotes: 1
