Reputation: 183
When we create an embedding layer using the class torch.nn.Embedding
, how are its weights initialized? Is a uniform or normal distribution used by default, or an initialization technique such as He or Xavier?
Upvotes: 4
Views: 2603
Reputation: 12857
In Embedding
, the weights are initialized by default from the standard normal distribution N(0, 1). You can check this in the reset_parameters()
method:
def reset_parameters(self):
    init.normal_(self.weight)
    ...
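A quick empirical sketch (assuming PyTorch is installed): sampling a large embedding table and checking the weight statistics confirms they look like draws from N(0, 1), and shows how to swap in a different scheme afterwards.

```python
import torch

# Default initialization: weights drawn from the standard normal N(0, 1).
emb = torch.nn.Embedding(num_embeddings=10000, embedding_dim=64)
w = emb.weight.detach()
print(w.mean().item())  # close to 0
print(w.std().item())   # close to 1

# To use another scheme (e.g. Xavier/Glorot), re-initialize in place:
torch.nn.init.xavier_uniform_(emb.weight.data)
```

With 640,000 sampled values, the empirical mean and standard deviation land very close to 0 and 1 respectively.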
Upvotes: 5