Kyr Kalash

Reputation: 3

Wrong model prediction

I have a binary classification problem: I want to detect raindrops in images. I trained a simple model, but its predictions are not good. I want a prediction between 0 and 1.

For my first try, I used ReLU for all layers except the final one (where I used softmax). For the loss, I used binary_crossentropy and then changed it to categorical_crossentropy. Neither of them worked.

from keras import optimizers
from keras.optimizers import Adam

# Adam optimizer is created here, but the compile() call below actually uses RMSprop
opt = Adam(lr=LEARNING_RATE, decay=LEARNING_RATE / EPOCHS)

cnNetwork.compile(loss='categorical_crossentropy',
                  optimizer=optimizers.RMSprop(lr=lr),
                  metrics=['accuracy'])
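If categorical_crossentropy is kept together with the two-unit softmax, the labels also have to be one-hot encoded. A minimal sketch, assuming the integer labels live in a hypothetical y_train_int array:

from keras.utils import to_categorical

# integer labels 0/1 must be one-hot encoded for categorical_crossentropy;
# binary_crossentropy with a single sigmoid unit would instead take the raw 0/1 labels
y_train = to_categorical(y_train_int, num_classes=2)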


from keras import backend as K
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Activation, Flatten, Dense

model = Sequential()

inputShape = (height, width, depth)

# if we are using "channels first", update the input shape
if K.image_data_format() == "channels_first":
    inputShape = (depth, height, width)

# first layer is a convolution with 20 filters and a 5x5 kernel (2 neighbouring pixels on each side)
model.add(Conv2D(20, (5, 5), padding="same",
    input_shape=inputShape))
# our activation function is ReLU (Rectified Linear Unit)
model.add(Activation("relu"))
# second layer is 2x2 max pooling, which halves the image resolution
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

# third layer - convolution with twice as many filters as the first convolution
model.add(Conv2D(40, (5, 5), padding="same"))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

# fifth layer is a fully connected layer that flattens our 3D feature maps into 1D arrays
model.add(Flatten())
model.add(Dense(500))
model.add(Activation("relu"))

# softmax classifier
model.add(Dense(classes))
model.add(Activation("softmax"))

I expect to get, for example, 0.1 for the first class and 0.9 for the second. Instead, I get 1, 1.3987518e-35. The main problem is that I always get 1 as the prediction.
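For reference, the numbers above would typically come from a predict call along these lines (test_image is a hypothetical, already-preprocessed array matching inputShape):

import numpy as np

# add a batch dimension so the shape is (1, height, width, depth)
image = np.expand_dims(test_image, axis=0)

probs = model.predict(image)[0]     # e.g. array([1.0, 1.3987518e-35])
predicted_class = np.argmax(probs)  # index of the most likely class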

Upvotes: 0

Views: 59

Answers (1)

Ernest S Kirubakaran

Reputation: 1564

You should be using binary_crossentropy, and there is nothing wrong with the output you got. The output 1, 1.3987518e-35 means the probability of the first class is almost 1 and the probability of the second class is very close to 0 (about 1e-35).
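A minimal sketch of what the binary_crossentropy setup could look like, assuming the final layer is changed to a single sigmoid unit (an assumption, not something spelled out in this answer), so the network outputs one probability between 0 and 1:

from keras import optimizers
from keras.layers import Dense, Activation

# single output unit with sigmoid gives one probability in [0, 1]
# (raindrop vs. no raindrop); labels are then plain 0/1 instead of one-hot
model.add(Dense(1))
model.add(Activation("sigmoid"))

model.compile(loss='binary_crossentropy',
              optimizer=optimizers.RMSprop(lr=lr),
              metrics=['accuracy'])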

Upvotes: 2
