user3009734

Reputation: 93

Analogies from Word2Vec in TensorFlow?

I implemented word embeddings in TensorFlow, similar to the code here. I was able to get the final embeddings (final_embeddings), but I would like to evaluate them using the analogies typical of this exercise (e.g. king - man + woman ≈ queen). How can I identify which term corresponds to which row of the final embeddings array? Alternatively, is there an implementation in TensorFlow for this? Any help would be greatly appreciated (specifics and resources would be a plus). Thanks!

Upvotes: 2

Views: 314

Answers (2)

rmeertens

Reputation: 4451

Which term corresponds to which row in the final embeddings array is entirely up to your implementation. At some point before training, you converted each word to a number, right? That number is the index of the word's row in your embeddings table.

If you want to know the specific name, you could post part of your code here.
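Once you have that word-to-index mapping, the analogy evaluation the question asks about is only a few lines of numpy. A minimal sketch, assuming you kept dictionary (word → id) and reverse_dictionary (id → word) from preprocessing, as in the tutorial the question links to, and that the rows of final_embeddings are L2-normalised (all three names are assumptions about your code):

    import numpy as np

    def analogy(a, b, c, final_embeddings, dictionary, reverse_dictionary, top_k=5):
        """Solve a : b :: c : ? by nearest neighbours of (b - a + c)."""
        query = (final_embeddings[dictionary[b]]
                 - final_embeddings[dictionary[a]]
                 + final_embeddings[dictionary[c]])
        query /= np.linalg.norm(query)
        sims = final_embeddings.dot(query)   # cosine similarity, since rows are unit length
        ranked = np.argsort(-sims)           # word ids, most similar first
        exclude = {dictionary[a], dictionary[b], dictionary[c]}  # skip the input words
        return [reverse_dictionary[i] for i in ranked if i not in exclude][:top_k]

    # e.g. analogy('man', 'king', 'woman', ...) should rank 'queen' near the top.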

Upvotes: 0

Qy Zuo

Reputation: 2732

I recommend this conceptual tutorial to you. If you are using skip-gram, the input is a one-hot encoding, so the index of the 1 is the row index of that word's vector.
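Concretely, here is a toy numpy illustration of that point (not from the original answer): multiplying a one-hot vector by the embedding matrix just selects one row.

    import numpy as np

    embedding = np.arange(12.0).reshape(4, 3)   # 4 words, 3-dim vectors
    one_hot = np.array([0.0, 0.0, 1.0, 0.0])    # the word with id 2

    print(one_hot @ embedding)   # [6. 7. 8.]
    print(embedding[2])          # [6. 7. 8.] -- same row, no multiply needed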


The implementation in TensorFlow is quite simple. You may want to look at this function: tf.nn.embedding_lookup

For example:

    embed = tf.nn.embedding_lookup(embedding, inputs)

embed then contains the embedding vectors for the word ids in inputs.
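For context, a minimal self-contained sketch of how the lookup fits together (TensorFlow 1.x style, matching the era of this question; the sizes are arbitrary):

    import tensorflow as tf

    vocabulary_size = 10000
    embedding_size = 128

    # One trainable row per word id.
    embedding = tf.Variable(
        tf.random_uniform([vocabulary_size, embedding_size], -1.0, 1.0))

    # inputs holds integer word ids; the lookup gathers the matching rows,
    # which is equivalent to multiplying by one-hot vectors but much cheaper.
    inputs = tf.placeholder(tf.int32, shape=[None])
    embed = tf.nn.embedding_lookup(embedding, inputs)  # [batch, embedding_size]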

Upvotes: 1
