SirCheckmatesalot

Reputation: 199

Accuracy is 1.0 but there is still some loss

I'm wondering why there is still some loss after the 3rd epoch even though the accuracy has reached 100%?

Epoch 1/5 1000/1000 [====================] - 3s 3ms/step - loss: 0.1170 - acc: 0.5384

Epoch 2/5 1000/1000 [====================] - 3s 3ms/step - loss: 0.0626 - acc: 0.8353

Epoch 3/5 1000/1000 [====================] - 3s 3ms/step - loss: 0.0351 - acc: 0.9432

Epoch 4/5 1000/1000 [====================] - 3s 3ms/step - loss: 0.0193 - acc: 1.0000

Epoch 5/5 1000/1000 [====================] - 3s 3ms/step - loss: 0.0146 - acc: 1.0000

Upvotes: 1

Views: 150

Answers (1)

Amir

Reputation: 16587

To make things clear, let's look at how accuracy and loss are calculated. Assume you want to do digit recognition on MNIST data. The output layer is a softmax layer, which assigns a probability to each label. Assume the input image is a 0 and the network's prediction is something like [0.99, 0.002, 0.001, ...., 0.001].

When we calculate accuracy, we only take the maximum value in this probability list, which means we choose the first index as the predicted label. But to calculate loss, we use a function like MSE, which measures how far every predicted probability is from its target. For the example above, the ground truth label is [1, 0, 0, ..., 0]. Computing the MSE between the ground truth and the network's prediction, a tiny difference still remains. As a result, the loss is not zero even though the accuracy is 1.0.
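You can verify this with a small sketch (the probability vector here is made up to match the example above):

```python
import numpy as np

# Hypothetical softmax output for an image of the digit 0 (10 classes)
pred = np.array([0.99, 0.002, 0.001, 0.001, 0.001,
                 0.001, 0.001, 0.001, 0.001, 0.001])
true = np.zeros(10)
true[0] = 1.0  # one-hot ground truth: the image is a 0

# Accuracy only compares the argmax to the true label: a hard 0/1 decision
correct = np.argmax(pred) == np.argmax(true)
print(correct)  # True -> this sample counts as correct

# MSE compares every probability to its target, so a small residual remains
# even when the prediction is correct
mse = np.mean((true - pred) ** 2)
print(mse)  # small but strictly greater than zero
```

So the prediction contributes 1.0 to accuracy while still contributing a small positive amount to the loss.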


Upvotes: 1
