David

Reputation: 139

What does `requires_grad` do in PyTorch, and should I use it?

I have a network where I need to add my own parameter that I want to be trainable. I am using nn.Parameter() to add it, but it has a `requires_grad` argument, and from reading the documentation I can't work out whether I want this to be True or False. It makes sense that I would set it to True, because I want this parameter to be optimised as part of the learning process -- but the need for the argument confuses me: if False means it is not optimised during training, then why would you use nn.Parameter() at all rather than just a plain Tensor?

From the documentation I see that nn.Parameter() adds the parameter to the iterable of parameters you obtain from the model (via model.parameters()), but I don't see why you would want this if you're not optimising it.
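A minimal sketch of the distinction I mean (the module and attribute names here are just made up for illustration):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # Wrapped in nn.Parameter: registered with the module,
        # so it shows up in model.parameters()
        self.scale = nn.Parameter(torch.ones(1), requires_grad=True)
        # Plain tensor attribute: NOT registered, so an optimizer
        # built from model.parameters() never sees it
        self.offset = torch.zeros(1)

net = Net()
names = [name for name, _ in net.named_parameters()]
print(names)  # ['scale'] -- only the nn.Parameter is listed
```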

Upvotes: 3

Views: 4580

Answers (1)

Mohammadreza Mohseni

Reputation: 106

As far as I know, sometimes you need to freeze or unfreeze part of your neural network, i.e. prevent or allow some of its parameters to be updated during training. The `requires_grad` argument gives you an easy way to include or exclude a parameter from the backpropagation phase: you just set it to True or False and it's done.
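For example, a common freezing pattern looks something like this (the model here is an arbitrary stand-in, not anything from the question):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Freeze the first linear layer: its tensors will not receive gradients
for p in model[0].parameters():
    p.requires_grad = False

# Hand only the still-trainable parameters to the optimizer
trainable = [p for p in model.parameters() if p.requires_grad]
opt = torch.optim.SGD(trainable, lr=0.1)

x = torch.randn(3, 4)
loss = model(x).sum()
loss.backward()

print(model[0].weight.grad)              # None -- the frozen layer got no gradient
print(model[2].weight.grad is not None)  # True -- the unfrozen layer did
```

Note that the frozen parameters still appear in model.parameters(); `requires_grad=False` only stops autograd from computing gradients for them.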

Upvotes: 9

Related Questions