Reputation: 251
I used the TensorFlow script word2vec_basic.py and saved the model with a tf.train.Saver:

    saver = tf.train.Saver()
    save_path = saver.save(sess, "./w2v/model.ckpt")

I can visualize the embedding with TensorBoard successfully, but the points are labeled with the words' indexes in the vocabulary.

How can I get the words themselves in the embedding visualization instead of their vocabulary indexes?
Upvotes: 1
Views: 1139
Reputation: 251
I used this answer: linking-tensorboard-embedding-metadata-to-checkpoint

The problem was that I was calling TensorBoard with logdir set to "./w2v/model.ckpt"; I should have called it with the directory only, "w2v/" (see the sketch below).
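For reference, the metadata linking from that answer can be done with the TF 1.x projector API, roughly like this. This is a minimal sketch, assuming the names from word2vec_basic.py (the embedding variable `embeddings`, the index-to-word mapping `reverse_dictionary`, `vocabulary_size`, and the session `sess`) and a log directory of "w2v":

    import os
    import tensorflow as tf
    from tensorflow.contrib.tensorboard.plugins import projector

    LOG_DIR = "w2v"

    # Write one word per line, in the same order as the vocabulary indexes,
    # so row i of the embedding matrix is labeled with word i.
    # (reverse_dictionary and vocabulary_size come from word2vec_basic.py.)
    with open(os.path.join(LOG_DIR, "metadata.tsv"), "w") as f:
        for i in range(vocabulary_size):
            f.write(reverse_dictionary[i] + "\n")

    # Point the TensorBoard projector at the embedding tensor and its metadata file.
    config = projector.ProjectorConfig()
    embedding_conf = config.embeddings.add()
    embedding_conf.tensor_name = embeddings.name
    embedding_conf.metadata_path = os.path.join(LOG_DIR, "metadata.tsv")

    summary_writer = tf.summary.FileWriter(LOG_DIR)
    projector.visualize_embeddings(summary_writer, config)

    # Save the checkpoint into the same directory the FileWriter uses.
    saver = tf.train.Saver()
    saver.save(sess, os.path.join(LOG_DIR, "model.ckpt"))

Then start TensorBoard with the directory, not the checkpoint file:

    tensorboard --logdir w2v/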
Upvotes: 0