Reputation: 131
I am implementing transfer learning with VGG16 for binary classification of diabetic retinopathy. Even after balancing the classes, my model predicts only a single class. Why does this happen? Below is my code:
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model

base_model = VGG16(weights='imagenet', include_top=False)  # imports the VGG16 model and discards the final classification layers
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1024, activation='relu')(x)  # we add dense layers so that the model can learn more complex functions and classify for better results
x = Dense(1024, activation='relu')(x)  # dense layer 2
x = Dense(512, activation='relu')(x)   # dense layer 3
preds = Dense(1, activation='softmax')(x)  # final layer with softmax activation
vgg = Model(inputs=base_model.input, outputs=preds)
Upvotes: 0
Views: 148
Reputation: 678
It seems like you're using a softmax activation function on your output. Softmax is typically used when classifying an input among multiple possible classes, as it outputs a probability distribution (i.e. all elements sum to 1). It does so by first exponentiating each element and then dividing each by the sum of all the exponentiated elements.
However, with only one output unit, softmax is forced to always output 1, since it is calculating exp(x_1) / exp(x_1) = 1.
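You can see this degenerate behaviour with a minimal, framework-free sketch of softmax (the function below is an illustration, not the Keras implementation):

```python
import math

def softmax(xs):
    # exponentiate each element, then normalise by the sum of the exponentials
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# With a single output unit, softmax is exp(x) / exp(x) = 1 for any logit x,
# so the model can only ever predict one class:
for x in (-5.0, 0.0, 3.7):
    print(softmax([x]))  # [1.0] every time

# With two or more units it produces a genuine probability distribution:
print(softmax([1.0, 2.0]))
```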
For a binary classification task as you're doing, I would recommend using a sigmoid output activation function instead:
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model

base_model = VGG16(weights='imagenet', include_top=False)  # imports the VGG16 model and discards the final classification layers
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1024, activation='relu')(x)  # dense layers so the model can learn more complex functions
x = Dense(1024, activation='relu')(x)  # dense layer 2
x = Dense(512, activation='relu')(x)   # dense layer 3
preds = Dense(1, activation='sigmoid')(x)  # final layer with sigmoid activation
vgg = Model(inputs=base_model.input, outputs=preds)
This assumes that the labels in your training dataset are 0 and 1, and that you compile the model with a binary cross-entropy loss (e.g. `loss='binary_crossentropy'`).
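To see why the sigmoid works with 0/1 labels, here is a minimal sketch (plain Python, not Keras): unlike a single-unit softmax, the sigmoid output actually varies with the logit, so it can be thresholded at 0.5 to recover a class prediction.

```python
import math

def sigmoid(x):
    # maps any real-valued logit into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# Different logits give different probabilities, unlike single-unit softmax;
# thresholding at 0.5 yields the predicted 0/1 class label:
for logit in (-2.0, 0.0, 2.0):
    p = sigmoid(logit)
    print(logit, round(p, 3), int(p >= 0.5))
```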
Upvotes: 1