Mohit Kumar

Reputation: 15

Batch normalisation during testing

I am working on a 2D time-series problem (input vectors of size 140*6) for binary classification using a CNN. I have not applied any scaling or normalisation to the data; instead I fed it directly into a CNN with 3 hidden layers and batch normalisation layers, using a batch size of 256 during training. Since I also have to test it in real time with a batch size of 1, how will batch normalisation work then, given that no mean or standard deviation has been stored for any layer during training? And should the batch normalisation layers be used in the forward pass during final testing, or should only the mean and standard deviation calculated during training be used?

Upvotes: 0

Views: 2246

Answers (2)

swageta

Reputation: 305

This question was asked two years ago, but I don't think the accepted answer is correct! Batch normalisation IS used during testing (at least you keep the batch normalisation layers), but you normalise the feature activations with the training data's running averages of the mean and variance (which you store during training). So it is not actual batch normalisation during testing, but rather a linear transformation using the saved training statistics. Therefore, if you are testing with a batch size of 1, you would just use the saved running averages from the training data.

The following thread answers the question: Batch normalization during testing
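As a concrete illustration, here is a minimal sketch assuming PyTorch (the question does not specify a framework, and the data and shapes here are purely illustrative). It shows that in eval mode a batch-norm layer normalises even a single test sample with the running mean and variance stored during training, which is exactly the linear transformation described above:

```python
# Assumed example: nn.BatchNorm1d over 6 features (the question's feature
# dimension); the inputs are random and only illustrative.
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(num_features=6)

# During training, batches of size 256 update the layer's running statistics.
bn.train()
for _ in range(10):
    _ = bn(torch.randn(256, 6))

# During testing, eval() makes the layer use the stored running mean/variance,
# so even a batch of size 1 is normalised with the training statistics.
bn.eval()
x = torch.randn(1, 6)
with torch.no_grad():
    y = bn(x)

# The same result, written out as the linear transformation with saved stats:
manual = (x - bn.running_mean) / torch.sqrt(bn.running_var + bn.eps)
manual = manual * bn.weight + bn.bias
print(torch.allclose(y, manual, atol=1e-6))  # True
```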

Upvotes: 1

Abhishek Verma

Reputation: 1729

Batch normalization is not used during testing. The reason is that batch normalization is used to alleviate the problem of covariate shift between different batches in the training data. That covariate shift leads to badly trained models, which is why we use it. It has no role to play during testing.

And if you have used batch normalization with a batch size of 1, then that is simply instance normalization.
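To see why, here is a small sketch (assuming PyTorch; the shapes are only illustrative): with a single sample, the per-channel statistics that batch norm computes over the batch are the same as the per-sample statistics that instance norm computes, so the two normalisations give the same result:

```python
# Assumed example: compare batch norm (using batch statistics) with instance
# norm on a single sample; with N=1 the per-channel statistics coincide.
import torch
import torch.nn as nn

x = torch.randn(1, 6, 140, 1)  # one sample, 6 channels, illustrative spatial size

bn = nn.BatchNorm2d(6, affine=False, track_running_stats=False)
inorm = nn.InstanceNorm2d(6, affine=False)

bn.train()  # use the current batch's statistics, not running averages
print(torch.allclose(bn(x), inorm(x), atol=1e-5))  # True
```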

Upvotes: 1
