JITENDER SINGH VIRK

Reputation: 128

Keras loss is in negative and accuracy is going down, but predictions are good?

I'm training a model in Keras with the TensorFlow-GPU backend. The task is to detect buildings in satellite images. The loss is going down (which is good), but it is negative, and accuracy is going down too. The strange part is that the model's predictions are improving. My questions: why is the loss negative, and why is the model improving while accuracy is going down?

from tensorflow.keras.layers import Conv2D
from tensorflow.keras.layers import BatchNormalization
from tensorflow.keras.layers import Activation
from tensorflow.keras.layers import MaxPool2D as MaxPooling2D
from tensorflow.keras.layers import UpSampling2D
from tensorflow.keras.layers import concatenate
from tensorflow.keras.layers import Input
from tensorflow.keras import Model
from tensorflow.keras.optimizers import RMSprop
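
# NOTE (illustrative stand-in): the custom dice_coeff used in the post is not
# shown, so this is a hypothetical definition using the common "smooth Dice"
# formulation, added only so the snippet runs end to end. A Dice coefficient
# computed on masks in [0, 1] is bounded by 1.
from tensorflow.keras import backend as K

def dice_coeff(y_true, y_pred, smooth=1.0):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)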


# LAYERS
inputs = Input(shape=(300, 300, 3))
# 300

down0 = Conv2D(32, (3, 3), padding='same')(inputs)
down0 = BatchNormalization()(down0)
down0 = Activation('relu')(down0)
down0 = Conv2D(32, (3, 3), padding='same')(down0)
down0 = BatchNormalization()(down0)
down0 = Activation('relu')(down0)
down0_pool = MaxPooling2D((2, 2), strides=(2, 2))(down0)
# 150

down1 = Conv2D(64, (3, 3), padding='same')(down0_pool)
down1 = BatchNormalization()(down1)
down1 = Activation('relu')(down1)
down1 = Conv2D(64, (3, 3), padding='same')(down1)
down1 = BatchNormalization()(down1)
down1 = Activation('relu')(down1)
down1_pool = MaxPooling2D((2, 2), strides=(2, 2))(down1)
# 75

center = Conv2D(1024, (3, 3), padding='same')(down1_pool)
center = BatchNormalization()(center)
center = Activation('relu')(center)  
center = Conv2D(1024, (3, 3), padding='same')(center)
center = BatchNormalization()(center)
center = Activation('relu')(center)
# center

up1 = UpSampling2D((2, 2))(center)
up1 = concatenate([down1, up1], axis=3)
up1 = Conv2D(64, (3, 3), padding='same')(up1)
up1 = BatchNormalization()(up1)
up1 = Activation('relu')(up1)
up1 = Conv2D(64, (3, 3), padding='same')(up1)
up1 = BatchNormalization()(up1)
up1 = Activation('relu')(up1)
up1 = Conv2D(64, (3, 3), padding='same')(up1)
up1 = BatchNormalization()(up1)
up1 = Activation('relu')(up1)
# 150

up0 = UpSampling2D((2, 2))(up1)
up0 = concatenate([down0, up0], axis=3)
up0 = Conv2D(32, (3, 3), padding='same')(up0)
up0 = BatchNormalization()(up0)
up0 = Activation('relu')(up0)
up0 = Conv2D(32, (3, 3), padding='same')(up0)
up0 = BatchNormalization()(up0)
up0 = Activation('relu')(up0) 
up0 = Conv2D(32, (3, 3), padding='same')(up0)
up0 = BatchNormalization()(up0)
up0 = Activation('relu')(up0)
# 300x300x32
classify = Conv2D(1, (1, 1), activation='sigmoid')(up0)
# 300x300x1

model = Model(inputs=inputs, outputs=classify)

model.compile(optimizer=RMSprop(learning_rate=0.0001), 
              loss='binary_crossentropy', 
              metrics=[dice_coeff, 'accuracy'])

history = model.fit(sample_input, sample_target, batch_size=4, epochs=5)



OUTPUT:

Epoch 6/10
500/500 [==============================] - 76s 153ms/step - loss: -293.6920 - 
dice_coeff: 1.8607 - acc: 0.2653
Epoch 7/10
500/500 [==============================] - 75s 150ms/step - loss: -309.2504 - 
dice_coeff: 1.8730 - acc: 0.2618
Epoch 8/10
500/500 [==============================] - 75s 150ms/step - loss: -324.4123 - 
dice_coeff: 1.8810 - acc: 0.2659
Epoch 9/10
136/500 [=======>......................] - ETA: 55s - loss: -329.0757 - dice_coeff: 1.8940 - acc: 0.2757

PREDICTED: [image of model output]

ACTUAL TARGET: [image of ground-truth mask]

Where is the problem? (Set dice_coeff aside; it's a custom metric.)

Upvotes: 4

Views: 5533

Answers (1)

Daniel Möller

Reputation: 86600

Your target masks are not normalized for binary classification. (Your input data is probably not normalized either.)

If you loaded the masks from image files, they're probably 0 to 255, or even 0 to 65535.
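That is exactly what makes the loss negative. Per pixel, binary crossentropy computes

loss = -(y * log(p) + (1 - y) * log(1 - p))

and assumes the target y is in [0, 1]. When y is, say, 255, the factor (1 - y) is a large negative number, so as the sigmoid output p moves toward 1 the bracketed sum grows without bound and the loss is driven toward minus infinity; hence the values around -300 in your training log. Unnormalized targets also inflate the Dice coefficient: with masks scaled by a constant, 2 * sum(y * p) / (sum(y) + sum(p)) can approach 2, which fits the dice_coeff values near 1.9 above.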

You should normalize y_train (divide by y_train.max()) and use a 'sigmoid' activation function at the end of your model, as your final layer already does.
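A minimal sketch of that fix, using the array names from the question (sample_target is the y_train referred to above; the uint8/uint16 dtypes are an assumption about how the images were loaded):

# Assumed: masks loaded from image files as uint8 (0-255) or uint16 (0-65535)
sample_target = sample_target.astype('float32')
sample_target /= sample_target.max()  # targets now in [0, 1]

# Scaling the input images to [0, 1] usually helps training as well
sample_input = sample_input.astype('float32') / 255.0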

Upvotes: 6
