Reputation: 1107
I understand that dropout does not have the same effect on the kernels of convolutional filters as it does on fully connected layers.
But does the same hold if you drop out a whole filter?
Let's assume a network structure like: Input, Conv2D, Conv2D, ..., Conv2D, Conv2D, Sigmoid. So there is no fully connected layer anywhere in the network.
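For concreteness, here is a minimal Keras sketch of such an all-convolutional network (the input shape, filter counts, and kernel sizes are just placeholders for illustration):

```python
from tensorflow import keras
from tensorflow.keras import layers

# All-convolutional network: Conv2D layers only, no fully connected layer.
# Input shape, filter counts, and kernel sizes are illustrative placeholders.
model = keras.Sequential([
    keras.Input(shape=(64, 64, 3)),
    layers.Conv2D(32, 3, padding="same", activation="relu"),
    layers.Conv2D(32, 3, padding="same", activation="relu"),
    layers.Conv2D(64, 3, padding="same", activation="relu"),
    layers.Conv2D(1, 3, padding="same", activation="sigmoid"),  # sigmoid output
])
```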
Question 1: Is it reasonable to drop out whole convolutional filters to reduce co-adaptation between filters and thereby improve filter visualizations?
Question 2: Is there a quick way to drop out whole filters in Keras?
Upvotes: 1
Views: 505
Reputation: 1107
Answer 1: Maybe.
Without dropout: [filter visualization image]
With dropout: [filter visualization image]
Answer 2: According to the Keras documentation, use keras.layers.Dropout(rate, noise_shape=None, seed=None) with noise_shape=(batch_size, 1, 1, features). Set a dimension to 1 if you want the dropout mask to be shared across that entire dimension; with both spatial dimensions set to 1, each feature map (i.e. each filter's output) is kept or dropped as a whole.
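For example, a minimal sketch assuming channels-last data and 32 filters (the input shape and filter count are placeholders):

```python
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(64, 64, 3))  # channels-last, placeholder shape
x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)

# noise_shape=(batch, 1, 1, channels): the binary dropout mask is broadcast
# over the spatial dimensions, so each feature map (one filter's output) is
# kept or dropped as a whole. None lets Keras fill in the batch size at run time.
x = layers.Dropout(0.3, noise_shape=(None, 1, 1, 32))(x)

x = layers.Conv2D(1, 3, padding="same", activation="sigmoid")(x)
model = keras.Model(inputs, x)
```

For this particular case, keras.layers.SpatialDropout2D(rate) drops entire feature maps as well, so it should give the same behavior without spelling out noise_shape by hand.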
Upvotes: 0