xtr33me

Reputation: 1106

ValueError: Input 0 of node Variable/Assign was passed int32 from Variable:0 incompatible with expected int32_ref

I am currently trying to get a trained TF seq2seq model working with TensorFlow.js, which requires converting the model to the JSON format TensorFlow.js expects. My input is a few sentences and the output is "embeddings". The model works when I read in the checkpoint, but I can't get it converted for TensorFlow.js. Part of the conversion process is to freeze my latest checkpoint as a protobuf (pb) file and then convert that to the JSON format.

The above is my understanding, and since I haven't done this before it may be wrong, so please feel free to correct anything I have deduced incorrectly from my reading.

When I try to convert to the tensorflow.js format I use the following command:

sudo tensorflowjs_converter \
    --input_format=tf_frozen_model \
    --output_node_names='embeddings' \
    --saved_model_tags=serve \
    ./saved_model/model.pb /web_model
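(As an aside, my understanding is that --saved_model_tags only applies to the tf_saved_model input format, so with tf_frozen_model it should be unnecessary; a minimal frozen-graph invocation would look something like this, assuming 'embeddings' really is the graph's output node name:)

```shell
tensorflowjs_converter \
    --input_format=tf_frozen_model \
    --output_node_names='embeddings' \
    ./saved_model/model.pb ./web_model
```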

This then displays the error listed in this post:

ValueError: Input 0 of node Variable/Assign was passed int32 from Variable:0 incompatible with expected int32_ref.

One of the problems I'm running into is that I'm not even sure how to troubleshoot this, so I was hoping one of you might have some guidance or know what my issue may be.

I have uploaded the code I used to convert the checkpoint file to protobuf at the link below. At the bottom of the notebook I then import that file, which produces the same error I get when trying to convert to the TensorFlow.js format. (Just scroll to the bottom of the notebook.)

https://github.com/xtr33me/textsumToTfjs/blob/master/convert_ckpt_to_pb.ipynb

Any help would be greatly appreciated!

Upvotes: 0

Views: 1315

Answers (1)

xtr33me

Reputation: 1106

I'm still unsure why I was getting the above error, but in the end I was able to resolve the issue by switching over to TF's SavedModel format via tf.saved_model. A rough example of what worked for me is below, should anyone run into something similar in the future. After saving out the model as shown, I was able to run tensorflowjs_converter on it and export the correct files.

import os
import shutil
import tensorflow as tf  # TF 1.x -- simple_save is not available in TF 2.x

if first_iter:  # first time through the loop
    first_iter = False
    # Save the model as a SavedModel
    cwd = os.getcwd()
    path = os.path.join(cwd, 'simple')
    shutil.rmtree(path, ignore_errors=True)  # clear out any previous export

    inputs_dict = {
        "batch_decoder_input": tf.convert_to_tensor(batch_decoder_input)
    }
    outputs_dict = {
        "batch_decoder_output": tf.convert_to_tensor(batch_decoder_output)
    }

    # Writes saved_model.pb plus a variables/ directory under `path`
    tf.saved_model.simple_save(
        sess, path, inputs_dict, outputs_dict
    )
    print('Model Saved')
    # End save-model code
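With the SavedModel written to ./simple, the conversion step then becomes something like the following (paths are from my setup, so adjust as needed):

```shell
tensorflowjs_converter \
    --input_format=tf_saved_model \
    --output_node_names='embeddings' \
    --saved_model_tags=serve \
    ./simple ./web_model
```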

Upvotes: 0
