Reputation: 1
Is it possible to load a pre-trained model, say ResNet152, and then freeze certain weights within particular layers while fine-tuning the others in those same layers? For example:
Say, freeze 50% of the filters in a particular conv layer, and fine-tune the remaining 50% of filters in that same layer?
I can't think of a solid way to implement this. I thought about loading the pre-trained model twice, freezing one copy and fine-tuning the other, but then I'd need to remove filters from both branches.
Upvotes: 0
Views: 217
Reputation: 640
You can freeze layers in TensorFlow/Keras by setting layer.trainable = False. When you want to fine-tune a pre-trained network, you can use something like the following to freeze some of the layers before compiling:

from tensorflow.keras.applications import ResNet152

model = ResNet152(weights="imagenet")
for layer in model.layers[:5]:  # freeze the first 5 layers
    layer.trainable = False
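
Note that layer.trainable only works at the granularity of whole layers, while the question asks about freezing some filters inside a single layer. There is no built-in Keras switch for that, but one possible way to approximate it (a sketch, not an official API; the layer name, optimizer, and loss below are assumptions) is a custom training step that zeroes the gradient for the frozen filter slices before applying the update:

import tensorflow as tf
from tensorflow.keras.applications import ResNet152

model = ResNet152(weights="imagenet")

# Layer name is an assumption; pick whichever conv layer you want to split.
target = model.get_layer("conv2_block1_1_conv")
kernel = target.kernel            # shape: (h, w, in_channels, out_filters)
n_frozen = kernel.shape[-1] // 2  # freeze the first 50% of the filters

# 0 for frozen output filters, 1 for trainable ones; this broadcasts over
# the last (filter) axis of the kernel gradient.
mask = tf.concat(
    [tf.zeros([n_frozen]), tf.ones([kernel.shape[-1] - n_frozen])], axis=0
)

optimizer = tf.keras.optimizers.Adam(1e-4)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    # Zero the gradient of the frozen filters so they keep their
    # pre-trained values; everything else updates normally.
    grads = [g * mask if v is kernel else g
             for v, g in zip(model.trainable_variables, grads)]
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

This keeps the first half of that kernel's output filters at their pre-trained values while the remaining filters are fine-tuned; the layer's bias (if any) and the rest of the network are left fully trainable here for simplicity.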
Upvotes: 0