Reputation: 1176
I have been trying to carry out transfer learning on a multiclass classification task, using ResNet as my backbone.
Many tutorials state that it is wise to retrain only the last layer (usually a fully connected layer) while freezing the other layers. The freezing is done like so:
for param in model.parameters():
    param.requires_grad = False
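In full, the pattern the tutorials describe looks roughly like this (a sketch assuming a recent torchvision; num_classes is a placeholder for my own task):

import torch.nn as nn
from torchvision.models import resnet18, ResNet18_Weights

num_classes = 10  # placeholder for the actual number of classes

# Load a pretrained backbone and freeze all of its parameters.
model = resnet18(weights=ResNet18_Weights.IMAGENET1K_V1)
for param in model.parameters():
    param.requires_grad = False  # note: requires_grad, not required_grad

# Replace the final fully connected layer. The new layer's parameters
# have requires_grad=True by default, so only the head gets trained.
model.fc = nn.Linear(model.fc.in_features, num_classes)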
However, I just realized that none of my layers were actually frozen, and while checking my code I found a typo:
for param in model.parameters():
    param.required_grad = False
That is, I wrote required_grad instead of requires_grad.
I can't seem to find any information on required_grad: what it is or what it does. The only thing I found out is that it does not change the requires_grad flag, and that a separate required_grad attribute ends up set to False instead.
Can anyone explain what required_grad does? Have I been 'not freezing' my other layers all this time?
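For reference, a quick way to check whether anything is actually frozen (assuming model is the network in question):

# Print each parameter's trainable flag; after the buggy loop these
# are all still True, i.e. nothing was frozen.
for name, param in model.named_parameters():
    print(name, param.requires_grad)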
Upvotes: 2
Views: 774
Reputation: 1176
Ok, this was really silly.
for param in model.parameters():
    param.required_grad = False
In this case, a new required_grad attribute is created on each parameter because of the typo I made. Python lets you set arbitrary attributes on most objects, so no error is raised. For example, even the following wouldn't raise an error:
for param in model.parameters():
    param.what_in_the_world = False
And all the parameters of the model would now have a what_in_the_world attribute.
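Here is a minimal sketch of the behavior (using a toy nn.Linear in place of the full ResNet):

import torch.nn as nn

model = nn.Linear(4, 2)

for param in model.parameters():
    param.required_grad = False  # typo: silently creates a new attribute

for param in model.parameters():
    print(param.requires_grad)  # True  -- the real flag is untouched
    print(param.required_grad)  # False -- the stray attribute from the typo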
I hope no one else wastes their time due to this.
Upvotes: 4