Reputation: 43
Is it possible to use pre-trained model features from VGG-16 and pass to GlobalAveragePooling2D() layer of other model in Keras?
Sample code for storing offline features of the VGG-16 network:
from keras import applications

model = applications.VGG16(include_top=False, weights='imagenet')
bottleneck_features_train = model.predict(input_images)  # avoid naming this `input`, which shadows a Python builtin
Sample code for the top model:
from keras.models import Sequential
from keras.layers import GlobalAveragePooling2D

model = Sequential()
model.add(GlobalAveragePooling2D())  # Here I want to use pre-trained features from VGG-16 as input.
I cannot use a Flatten() layer because I want to predict multiple labels across multiple classes.
Upvotes: 3
Views: 1646
Reputation: 454
Sure, you definitely can. You've got a couple of options:
pooling kwarg
Use the pooling kwarg in the VGG16 constructor, which replaces the last pooling layer with the specified type, e.g.
model_base = keras.applications.vgg16.VGG16(include_top=False, input_shape=(*IMG_SIZE, 3), weights='imagenet', pooling="avg")
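With pooling="avg", the base model already emits a flat feature vector per image, so there is no need for Flatten() at all. A minimal sketch of what that output looks like (IMG_SIZE and the dummy batch are assumptions; weights=None is used here just to keep the sketch self-contained, whereas in practice you would pass weights='imagenet'):

```python
import numpy as np
from keras.applications import VGG16

IMG_SIZE = (224, 224)  # assumed input size

# pooling='avg' applies global average pooling over the last conv block,
# so the model outputs a flat (batch, 512) feature vector per image
model_base = VGG16(include_top=False, input_shape=(*IMG_SIZE, 3),
                   weights=None, pooling='avg')  # use weights='imagenet' in practice

images = np.random.rand(2, *IMG_SIZE, 3)  # dummy batch of 2 images
features = model_base.predict(images, verbose=0)
print(features.shape)  # (2, 512)
```

Those (batch, 512) vectors can be fed straight into Dense layers of a top model, which sidesteps the Flatten() issue from the question.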
You can also add more layers to the pretrained model:
import keras
from keras.models import Model
from keras.layers import GlobalAveragePooling2D

model_base = keras.applications.vgg16.VGG16(include_top=False, input_shape=(*IMG_SIZE, 3), weights='imagenet')
output = model_base.output
output = GlobalAveragePooling2D()(output)
# Add any other layers you want to `output` here...
model = Model(model_base.input, output)
for layer in model_base.layers:
    layer.trainable = False
That final loop freezes the pretrained layers so that you preserve the features of the pretrained model and train only the new layers.
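Since the question mentions multi-label prediction, the new layers would typically end in a Dense head with a sigmoid activation and binary_crossentropy loss, so each label is scored independently. A sketch under those assumptions (NUM_CLASSES and IMG_SIZE are hypothetical; weights=None keeps the sketch self-contained, whereas in practice you would use weights='imagenet'):

```python
from keras.applications import VGG16
from keras.layers import GlobalAveragePooling2D, Dense
from keras.models import Model

NUM_CLASSES = 10       # hypothetical number of labels
IMG_SIZE = (224, 224)  # assumed input size

base = VGG16(include_top=False, input_shape=(*IMG_SIZE, 3),
             weights=None)  # use weights='imagenet' in practice

x = GlobalAveragePooling2D()(base.output)
# sigmoid + binary_crossentropy treats each label independently,
# which is what multi-label classification needs (unlike softmax)
preds = Dense(NUM_CLASSES, activation='sigmoid')(x)

model = Model(base.input, preds)
for layer in base.layers:
    layer.trainable = False  # freeze the pretrained convolutional base

model.compile(optimizer='adam', loss='binary_crossentropy')
```

With this setup only the GlobalAveragePooling2D/Dense head is trained, and the model predicts a probability per label rather than a single class.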
I wrote a blog post that goes through the basics of working with pretrained models and extending them to work on various image classification problems; it's also got a link to some working code examples that might provide more context: http://innolitics.com/10x/pretrained-models-with-keras/
Upvotes: 3