Jeena KK

Reputation: 183

What is the default weight initialization used in the PyTorch embedding layer?

When we create an embedding layer with the class torch.nn.Embedding, how are its weights initialized? Is a uniform or normal distribution used by default, or an initialization technique like He or Xavier?

Upvotes: 4

Views: 2603

Answers (1)

Harshit Kumar

Reputation: 12857

In nn.Embedding, the weights are initialized by default from the standard normal distribution N(0, 1). You can verify this in its reset_parameters() method:

def reset_parameters(self):
    init.normal_(self.weight)
    ...
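As a sanity check, you can confirm this empirically: with enough entries, the sample mean of a freshly created embedding's weights should be close to 0 and the sample standard deviation close to 1. A minimal sketch (the layer sizes here are arbitrary):

```python
import torch
import torch.nn as nn

# A freshly constructed embedding; reset_parameters() has already run.
emb = nn.Embedding(num_embeddings=1000, embedding_dim=64)

w = emb.weight.detach()
print(w.mean().item())  # close to 0
print(w.std().item())   # close to 1

# If you want a different scheme, re-initialize the weight in place,
# e.g. with Xavier/Glorot uniform:
nn.init.xavier_uniform_(emb.weight)
```

Note that nn.Linear uses a different default (Kaiming-style uniform), so the N(0, 1) default is specific to nn.Embedding.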

Upvotes: 5
