Reputation: 431
Let's suppose that the model I'm loading has 4 layers (layer0, layer1, layer2, layer3) for simplicity. If I only wanted that model to be pretrained for, say, layer0 and layer1, but to have randomly initialized parameters for layer2 and layer3, how would I be able to do that?
Upvotes: 0
Views: 496
Reputation: 809
You can do it by freezing the layers you want to keep unchanged (the pretrained ones) and leaving the others unfrozen (they will continue to train):
from torchvision import models

model_ft = models.resnet50(pretrained=True)

ct = 0
for child in model_ft.children():
    ct += 1
    if ct < 7:
        # Freeze this child's parameters so they keep their pretrained values
        for param in child.parameters():
            param.requires_grad = False
This freezes the first 6 of the 10 top-level children of ResNet50, leaving the remaining 4 trainable.
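Applied to the 4-layer example in the question, the same pattern would look something like this (a minimal sketch; the model and its submodule order are placeholders for whatever architecture you actually load):

import torch.nn as nn

# Hypothetical 4-layer model standing in for the one in the question;
# the layer sizes and types here are illustrative only.
model = nn.Sequential(
    nn.Linear(16, 16),  # layer0 (pretrained, to be frozen)
    nn.Linear(16, 16),  # layer1 (pretrained, to be frozen)
    nn.Linear(16, 16),  # layer2 (stays trainable)
    nn.Linear(16, 16),  # layer3 (stays trainable)
)

# Freeze only the first two children; the rest keep requires_grad=True
# and will continue to update during training.
for i, child in enumerate(model.children()):
    if i < 2:
        for param in child.parameters():
            param.requires_grad = False

When building the optimizer you can also pass only the parameters that still require gradients, e.g. filter(lambda p: p.requires_grad, model.parameters()), so the frozen layers are skipped entirely.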
Upvotes: 1