Franva

Reputation: 7077

Keras: Why isn't binary classification as accurate as categorical classification?

I am trying to create a model which can tell whether there are birds in an image or not.

I was using categorical classification to train the model to recognize Bird vs. Flower, and the results turned out to be very successful in terms of recognizing these 2 classes.

But when I changed it to binary classification to detect the presence of birds in an image, the accuracy dropped dramatically.

The reason I changed to binary classification is that when I gave my categorically trained model a dog image, it recognized the dog as a bird.

By the way, here is my dataset structure:

Training: 5000 bird images and 2000 non-bird images

Validation: 1000 bird images and 500 non-bird images


Someone said that an imbalanced dataset will also cause problems. Is that true?

Could someone please point out where I went wrong in the following code?

import os
from keras.applications.inception_v3 import InceptionV3, preprocess_input
from keras.callbacks import EarlyStopping
from keras.layers import BatchNormalization, Dense, GlobalAveragePooling2D
from keras.models import Model, Sequential
from keras.optimizers import Adam
from keras.preprocessing.image import ImageDataGenerator

def get_num_files(path):
    # Count all files under path (recursively)
    if not os.path.exists(path):
        return 0
    return sum([len(files) for r, d, files in os.walk(path)])

def get_num_subfolders(path):
    # Count the sub-folders under path (used as the number of classes)
    if not os.path.exists(path):
        return 0
    return sum([len(d) for r, d, files in os.walk(path)])

def create_img_generator():
    return ImageDataGenerator(
        preprocessing_function=preprocess_input,
        rotation_range=30,
        width_shift_range=0.2,
        height_shift_range=0.2,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True
    )

INIT_LT = 1e-3
Image_width, Image_height = 299, 299
Training_Epochs = 30
Batch_Size = 32
Number_FC_Neurons = 1024
Num_Classes = 2

train_dir = 'to my train folder'
validate_dir = 'to my validation folder'


num_train_samples = get_num_files(train_dir)
num_classes = get_num_subfolders(train_dir)
num_validate_samples = get_num_files(validate_dir)

num_epoch = Training_Epochs
batch_size = Batch_Size

train_image_gen = create_img_generator()
test_image_gen = create_img_generator()

train_generator = train_image_gen.flow_from_directory(
    train_dir,
    target_size=(Image_width, Image_height),
    batch_size=batch_size,
    seed=42
)

validation_generator = test_image_gen.flow_from_directory(
    validate_dir,
    target_size=(Image_width, Image_height),
    batch_size=batch_size,
    seed=42
)

Inceptionv3_model = InceptionV3(weights='imagenet', include_top=False)
print('Inception v3 model without last FC loaded')

x = Inceptionv3_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(Number_FC_Neurons, activation='relu')(x)
predictions = Dense(num_classes, activation='softmax')(x)

# model = Model(inputs=Inceptionv3_model.input, outputs=predictions)
v3model = Model(inputs=Inceptionv3_model.input, outputs=predictions)
# Wrap v3model in a new Sequential model and add a batch normalization layer after it
model = Sequential()
model.add(v3model)
model.add(BatchNormalization()) # added normalization
print(model.summary())

print('\nFine tuning existing model')

Layers_To_Freeze = 172
for layer in model.layers[:Layers_To_Freeze]:
    layer.trainable = False
for layer in model.layers[Layers_To_Freeze:]:
    layer.trainable = True

optimizer = Adam(lr=INIT_LT, decay=INIT_LT / Training_Epochs)
# optimizer = SGD(lr=0.0001, momentum=0.9)
model.compile(optimizer=optimizer, loss='binary_crossentropy', metrics=['accuracy'])

cbk_early_stopping = EarlyStopping(monitor='val_acc', mode='max')

history_transfer_learning = model.fit_generator(
    train_generator,
    steps_per_epoch=num_train_samples,
    epochs=num_epoch,
    validation_data=validation_generator,
    validation_steps=num_validate_samples,
    class_weight='auto',
    callbacks=[cbk_early_stopping]
)

model.save('incepv3_transfer_mini_binary.h5', overwrite=True, include_optimizer=True)

Upvotes: 0

Views: 716

Answers (1)

Daniel Möller

Reputation: 86600

Categorical

  • Use Num_Classes = 2
  • Use one-hot-encoded targets (example: Bird = [1, 0], Flower = [0, 1])
  • Use 'softmax' activation
  • Use 'categorical_crossentropy' (see the sketch below)

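A minimal sketch of the categorical setup, assuming the same InceptionV3 base as in the question (variable names here are illustrative, not the asker's exact code):

from keras.applications.inception_v3 import InceptionV3
from keras.layers import Dense, GlobalAveragePooling2D
from keras.models import Model

base = InceptionV3(weights='imagenet', include_top=False)
x = GlobalAveragePooling2D()(base.output)
x = Dense(1024, activation='relu')(x)
# 2 output units + softmax, trained against one-hot labels
predictions = Dense(2, activation='softmax')(x)

model = Model(inputs=base.input, outputs=predictions)
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

# flow_from_directory(..., class_mode='categorical') is the default and
# yields the one-hot labels this head expects.
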
Binary

  • Use Num_Classes = 1
  • Use binary targets (example: is flower = 1 | not flower = 0)
  • Use 'sigmoid' activation
  • Use 'binary_crossentropy' (see the sketch below)

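And a minimal sketch of the binary setup, which is the change the question is after (same assumed base; only the output layer, loss, and label format differ):

from keras.applications.inception_v3 import InceptionV3
from keras.layers import Dense, GlobalAveragePooling2D
from keras.models import Model

base = InceptionV3(weights='imagenet', include_top=False)
x = GlobalAveragePooling2D()(base.output)
x = Dense(1024, activation='relu')(x)
# A single output unit + sigmoid, trained against 0/1 labels
prediction = Dense(1, activation='sigmoid')(x)

model = Model(inputs=base.input, outputs=prediction)
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# The generators must then be created with class_mode='binary'
# so they yield 0/1 labels instead of one-hot vectors.
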
Details here: Using categorical_crossentropy for only two classes

Upvotes: 3
