ptk

Reputation: 7653

How do I tell if I have successfully frozen or unfrozen a layer in Keras?

How do you know when you've successfully frozen a layer in Keras? Below is a snippet of my model where I am trying to freeze the entire DenseNet121 layer; however, I'm unsure if that is actually occurring since the outputs to the console don't indicate what's happening.

I've tried two methods: (1) densenet.trainable = False, and (2) model.layers[0].trainable = False.

Furthermore, if I load the model again and add model.layers[0].trainable = True, will this unfreeze the layer?

# Imports assumed for this snippet (standalone Keras; adjust if using tensorflow.keras)
from keras.applications import DenseNet121
from keras.models import Sequential
from keras import layers

densenet = DenseNet121(
    weights='/{}'.format(WEIGHTS_FILE_NAME),
    include_top=False,
    input_shape=(IMG_SIZE, IMG_SIZE, 3)
)

model = Sequential()
model.add(densenet)

model.add(layers.GlobalAveragePooling2D())
model.add(layers.Dropout(0.5))
model.add(layers.Dense(NUM_CLASSES, activation='sigmoid'))
model.summary()

# This is how I freeze my layers; I did it twice because I wasn't sure it was working
densenet.trainable = False
model.layers[0].trainable = False

history = model.fit_generator(
                    datagen.flow(x_train, y_train, batch_size=BATCH_SIZE),
                    steps_per_epoch=len(x_train) / BATCH_SIZE,
                    epochs=NUM_EPOCHS,
                    validation_data=(x_test, y_test),
                    callbacks=callbacks_list,
                    max_queue_size=2
                   )

Below is the output of model.summary(), which I would expect to indicate if a layer has been successfully frozen or not.

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Model)          (None, 8, 8, 1024)        7037504   
_________________________________________________________________
global_average_pooling2d_3 ( (None, 1024)              0         
_________________________________________________________________
dropout_2 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense_2 (Dense)              (None, 5)                 5125      
=================================================================
Total params: 7,042,629
Trainable params: 5,125
Non-trainable params: 7,037,504
_________________________________________________________________
Epoch 1/100
354/353 [==============================] - 203s 573ms/step - loss: 0.4374 - acc: 0.8098 - val_loss: 0.3785 - val_acc: 0.8290
val_kappa: 0.0440
Epoch 2/100
354/353 [==============================] - 199s 561ms/step - loss: 0.3738 - acc: 0.8457 - val_loss: 0.3575 - val_acc: 0.8310
val_kappa: 0.0463
Epoch 3/100

Upvotes: 0

Views: 1236

Answers (2)

Kevin M

Reputation: 188

You can check whether a layer is frozen by looking at its config:

>>> model.get_layer("dense_2").get_config()
{'name': 'dense_2',
 'trainable': True,
...

If trainable is True, it is unfrozen.
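
To check every layer at once, a minimal sketch (assuming the model and densenet variables from the question) is to loop over model.layers and print each layer's trainable flag, which is the same value get_config() reports:

# Print the trainable flag of every top-level layer in the Sequential model
for layer in model.layers:
    print(layer.name, layer.trainable)

# The DenseNet base is a nested Model, so you can also check its flag directly
print(densenet.trainable)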

Upvotes: 1

Manoj Mohan

Reputation: 6044

however, I'm unsure if that is actually occurring since the outputs to the console don't indicate what's happening.

It does, as can be seen from the number of trainable parameters in the summary. As expected, only the parameters of the last Dense layer (5,125) are trainable.

Total params: 7,042,629
Trainable params: 5,125
Non-trainable params: 7,037,504
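
If you prefer to verify this programmatically rather than reading the summary, here is a minimal sketch (assuming the same model object and the standalone Keras backend used in the question) that counts the parameters in model.trainable_weights and model.non_trainable_weights; the totals should match the summary above:

import numpy as np
from keras import backend as K

# Count parameters in the trainable vs. non-trainable weight lists
trainable_count = int(np.sum([K.count_params(w) for w in model.trainable_weights]))
non_trainable_count = int(np.sum([K.count_params(w) for w in model.non_trainable_weights]))

print('Trainable params:', trainable_count)          # expected: 5,125
print('Non-trainable params:', non_trainable_count)  # expected: 7,037,504

Also note that changes to a layer's trainable flag only take effect on the next compile, so if you unfreeze the base model after loading, compile the model again before training.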

Upvotes: 2
