Reputation: 1012
I am trying to train an RNN on top of pre-trained word embeddings. Suppose these pre-trained embeddings are kept in a matrix E, which I can use to initialize a LookupTable:
lookupTable = nn.LookupTable(n_words, d)
lookupTable.weight:copy(E)  -- copy the pre-trained embeddings into the layer's weight tensor
How can I force the model to keep these embeddings fixed during training?
Upvotes: 0
Views: 764
Reputation: 436
There are maybe two possibilities:

1. Force the weights for this layer to stay at their pre-trained values at each mini-batch iteration: zero the layer's gradWeight after the backward pass, or copy E back into the weights after every parameter update (see the first sketch below).
2. Implement your own lookup table by extending nn.LookupTable and overriding accGradParameters (not updateOutput, which is the forward pass and never touches the weights) with a no-op, so that no gradient is ever accumulated and the weights are never updated (see the second sketch below).
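A minimal sketch of the first approach in Torch/Lua, assuming n_words, d, and the pre-trained matrix E come from your setup as in the question (stand-in values are used here so the snippet runs on its own):

```lua
require 'nn'

local n_words, d = 10000, 50       -- stand-in sizes for this sketch
local E = torch.randn(n_words, d)  -- stand-in for the real pre-trained matrix

local lookupTable = nn.LookupTable(n_words, d)
lookupTable.weight:copy(E)

local model = nn.Sequential()
model:add(lookupTable)
-- ... add the rest of the RNN here ...

-- inside the training loop, after the backward pass:
-- model:forward(input)
-- model:backward(input, gradOutput)

-- Option A: wipe the embedding gradients before the update,
-- so the update step has nothing to apply to this layer
lookupTable.gradWeight:zero()
-- model:updateParameters(learningRate)

-- Option B: let the update happen, then restore the pre-trained values
-- lookupTable.weight:copy(E)
```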
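And a sketch of the second approach. The class name nn.FixedLookupTable is made up for this example; the important detail is that accGradParameters is the method that accumulates gradients into gradWeight, while the inherited forward pass (updateOutput) is left untouched:

```lua
require 'nn'

-- 'nn.FixedLookupTable' is a hypothetical name for this sketch;
-- it inherits everything from nn.LookupTable, including __init
local FixedLookupTable, parent = torch.class('nn.FixedLookupTable', 'nn.LookupTable')

-- no-op: nothing is ever accumulated into gradWeight, so a later
-- updateParameters() leaves the embeddings exactly as initialized
function FixedLookupTable:accGradParameters(input, gradOutput, scale)
end

-- usage: drop it in wherever nn.LookupTable was used
local lt = nn.FixedLookupTable(10000, 50)  -- stand-in sizes
-- lt.weight:copy(E)
```

One caveat: this only cuts the gradient path; anything that modifies the parameters directly (weight decay in the optimizer, for instance) would still move these weights.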
Upvotes: 1