Tomer Levinboim

Reputation: 1012

In Torch, how can I keep a pre-trained embedding fixed during training?

I am trying to train an RNN on pre-trained word embeddings. Suppose these pre-trained embeddings are kept in a matrix E, which I can use to initialize a LookupTable:

lookupTable = nn.LookupTable(n_words, d)
lookupTable.weight = E

How can I force the model to keep these embeddings fixed during training?

Upvotes: 0

Views: 764

Answers (1)

Fafhrd

Reputation: 436

There are maybe two possibilities:

  1. Force the weights of this layer to stay at their pre-trained values: after each mini-batch update, copy E back into the layer's weights (or zero the layer's gradient before the update), as sketched below.

  2. Implement your own lookup table by extending nn.LookupTable and overriding accGradParameters (the method that accumulates the weight gradients) so that no gradient is ever accumulated and the weights are never updated; see the second sketch below.
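
A minimal sketch of the first approach, assuming the embeddings were loaded with lookupTable.weight:copy(E) (so E itself is never modified by training) and that trainBatch, n_epochs and n_batches are placeholder names for your own training loop:

for epoch = 1, n_epochs do
    for batch = 1, n_batches do
        trainBatch(batch)              -- forward, backward, parameter update
        lookupTable.weight:copy(E)     -- restore the pre-trained embeddings
    end
end
-- equivalent variant: zero the layer's gradient before the update instead,
-- e.g. lookupTable.gradWeight:zero()

And a minimal sketch of the second approach, defining a hypothetical nn.FixedLookupTable whose gradient accumulation is a no-op, so updateParameters leaves its weights untouched:

require 'nn'

-- a LookupTable whose embedding matrix is never updated
local FixedLookupTable, parent = torch.class('nn.FixedLookupTable', 'nn.LookupTable')

function FixedLookupTable:accGradParameters(input, gradOutput, scale)
    -- do nothing: no gradient is accumulated for the weight matrix,
    -- so the pre-trained embeddings stay fixed during training
end

-- usage:
-- lookupTable = nn.FixedLookupTable(n_words, d)
-- lookupTable.weight:copy(E)

If you do not want to define a new class, the same effect can be obtained by monkey-patching the existing layer: lookupTable.accGradParameters = function() end.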

Upvotes: 1
