Blacky_99

Reputation: 185

Why is batch normalization reducing my model training accuracy?

Before I added a batch normalization layer my model had a training accuracy of 97%, but once I added the batch normalization layer the accuracy went down to 70%. Why is that happening, and are there any ways to improve the accuracy while keeping batch normalization?

    from keras.models import Sequential
    from keras.layers import (Conv2D, Activation, BatchNormalization,
                              MaxPooling2D, Flatten, Dense, Dropout)
    from keras.callbacks import ModelCheckpoint

    model = Sequential()

    # first convolutional block
    model.add(Conv2D(32, (3, 3), input_shape=input_shape))
    model.add(Activation('relu'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))

    # second convolutional block
    model.add(Conv2D(32, (3, 3)))
    model.add(Activation('relu'))
    model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))

    # classifier head
    model.add(Flatten())
    model.add(Dense(64))
    model.add(Activation('relu'))
    model.add(Dropout(0.5))
    model.add(Dense(1))
    model.add(Activation('sigmoid'))
    model.summary()

    model_checkpoint = ModelCheckpoint("model.hdf5", verbose=1, save_best_only=True)
    model.compile(loss='binary_crossentropy',
                  optimizer='adam',
                  metrics=['accuracy'])

Upvotes: 1

Views: 1755

Answers (1)

s510

Reputation: 2812

Batch normalisation doesn't guarantee that your performance will increase, though it does work well in many cases.

Some things you can try:

  1. Increase the training batch size. Larger batches give more accurate mean and standard deviation estimates for normalisation.

  2. Play around with the BN parameters, specifically the momentum parameter. See the docs for details: https://keras.io/api/layers/normalization_layers/batch_normalization/ I would suggest decreasing the momentum and trying again.

  3. If it still doesn't work, leave the layer out.
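To see why point 1 matters, here is a small NumPy sketch (illustrative only, not the asker's model) showing that the per-batch mean varies much less across large batches than across small ones, so the statistics BN normalises with are more stable:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated activations of one unit: true mean 2.0, true std 3.0
activations = rng.normal(loc=2.0, scale=3.0, size=100_000)

def batch_mean_spread(data, batch_size):
    """Std-dev of per-batch means: lower = more stable BN statistics."""
    usable = len(data) // batch_size * batch_size
    batches = data[:usable].reshape(-1, batch_size)
    return batches.mean(axis=1).std()

print(f"spread of batch means, batch=8:   {batch_mean_spread(activations, 8):.3f}")
print(f"spread of batch means, batch=256: {batch_mean_spread(activations, 256):.3f}")
# The batch=256 spread is several times smaller: each batch's mean is a
# much tighter estimate of the true mean that BN is trying to remove.
```

The spread shrinks roughly like 1/sqrt(batch_size), which is why tiny batches can make BN's noisy statistics hurt rather than help.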

Upvotes: 3

Related Questions