soartseng

Reputation: 273

Zero accuracy training a neural network in Keras

I'm training a neural network for a regression problem in Keras. When the output is one-dimensional, why does the accuracy in every epoch show acc: 0.0000e+00?

like this:

1000/199873 [..............................] - ETA: 5s - loss: 0.0057 - acc: 0.0000e+00

2000/199873 [..............................] - ETA: 4s - loss: 0.0058 - acc: 0.0000e+00

3000/199873 [..............................] - ETA: 3s - loss: 0.0057 - acc: 0.0000e+00

4000/199873 [..............................] - ETA: 3s - loss: 0.0060 - acc: 0.0000e+00 ...

198000/199873 [============================>.] - ETA: 0s - loss: 0.0055 - acc: 0.0000e+00

199000/199873 [============================>.] - ETA: 0s - loss: 0.0055 - acc: 0.0000e+00

199873/199873 [==============================] - 4s - loss: 0.0055 - acc: 0.0000e+00 - val_loss: 0.0180 - val_acc: 0.0000e+00

Epoch 50/50

But if the output is two-dimensional or higher, there is no problem with the accuracy.

My model is as below:

input_dim = 14
batch_size = 1000
nb_epoch = 50
lrelu = LeakyReLU(alpha = 0.1)

model = Sequential()
model.add(Dense(126, input_dim=input_dim)) #Dense(output_dim (also hidden weight), input_dim=input_dim)
model.add(lrelu) #Activation

model.add(Dense(252))
model.add(lrelu)
model.add(Dense(1))
model.add(Activation('linear'))

model.compile(loss= 'mean_squared_error', optimizer='Adam', metrics=['accuracy'])
model.summary()
history = model.fit(X_train_1, y_train_1[:,0:1],
                    batch_size=batch_size,
                    nb_epoch=nb_epoch,
                    verbose=1,
                    validation_split=0.2)

loss = history.history.get('loss')
acc = history.history.get('acc')
val_loss = history.history.get('val_loss')
val_acc = history.history.get('val_acc')

'''saving model'''
from keras.models import load_model
model.save('XXXXX')
del model

'''loading model'''
model = load_model('XXXXX')

'''prediction'''
pred = model.predict(X_train_1, batch_size, verbose=1)
ans = [np.argmax(r) for r in y_train_1[:,0:1]]

Upvotes: 23

Views: 29966

Answers (5)

Aaditya Ura

Reputation: 12669

There can be a few issues with your model; check and fix the following:

  1. Check whether your batch_size is too large or too small
  2. Check whether the learning rate is too high or too low
  3. If it's a text dataset, check the length of the sentences; if too long, trim them to the average length
  4. Check for NaN values in your dataset and fix them
  5. Check that the dataset doesn't contain special symbols or characters
  6. If the values are continuous, normalize them before feeding them to the network (also look at batch normalization)
  7. Try regularization techniques
  8. Check the activation functions
  9. Shuffle the data
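Points 4 and 6 of the checklist can be sketched with plain NumPy (a minimal sketch on a hypothetical feature matrix, not the asker's actual data):

```python
import numpy as np

# Hypothetical feature matrix with one NaN to illustrate the checks
X = np.array([[1.0, 200.0],
              [2.0, np.nan],
              [3.0, 400.0]])

# 4. Find rows containing NaN values and drop (or impute) them
nan_rows = np.isnan(X).any(axis=1)
X_clean = X[~nan_rows]

# 6. Normalize continuous features to zero mean, unit variance
mean = X_clean.mean(axis=0)
std = X_clean.std(axis=0)
X_norm = (X_clean - mean) / std
```

In practice the same mean/std computed on the training split should also be applied to the validation and test splits.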

Upvotes: 1

Anna Maule

Reputation: 260

I ran into a similar problem. After trying all the suggestions, with none of them working, I figured something must be wrong somewhere else.

After looking at my data distribution, I realized that I was not shuffling my data. So my training data was mostly one class, and my testing data was 100% another class. After shuffling the data, the accuracy was no longer 0.0000e+00; it was something more meaningful.
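Shuffling before splitting can be sketched like this (toy data, ordered by class on purpose; note that Keras's validation_split takes the last fraction of the data without shuffling first):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: features and labels, deliberately ordered by class
X = np.arange(10).reshape(10, 1).astype(float)
y = np.array([0] * 5 + [1] * 5)

# Shuffle features and labels with the SAME permutation
perm = rng.permutation(len(X))
X_shuffled, y_shuffled = X[perm], y[perm]

# An 80/20 split now draws from both classes instead of just the tail
split = int(0.8 * len(X))
X_train, y_train = X_shuffled[:split], y_shuffled[:split]
X_test, y_test = X_shuffled[split:], y_shuffled[split:]
```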

Upvotes: 4

Austin Eaton

Reputation: 78

Just a quick add-on to the excellent answers already posted.

The following snippet is a custom metric that displays the average percentage difference between your NN's predictions and the actual values.

from keras import backend as K

def percentage_difference(y_true, y_pred):
    # Mean absolute percentage deviation of predictions from targets
    return K.mean(K.abs(y_pred / y_true - 1) * 100)

To use it, simply add it to the "metrics" option when compiling your model, i.e.

model.compile(loss='mean_squared_error',
              optimizer='Adam',
              metrics=['accuracy', percentage_difference])
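For intuition, the same formula can be checked with plain NumPy on hand-picked numbers (a sketch; the Keras version above operates on tensors batch by batch):

```python
import numpy as np

def percentage_difference_np(y_true, y_pred):
    # Mean absolute percentage deviation of predictions from targets
    return np.mean(np.abs(y_pred / y_true - 1) * 100)

y_true = np.array([10.0, 20.0, 40.0])
y_pred = np.array([11.0, 18.0, 40.0])

# Per-sample deviations: 10%, 10%, 0% -> mean of about 6.67%
result = percentage_difference_np(y_true, y_pred)
```

Note that the metric divides by y_true, so it is undefined when a target is exactly zero.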

Upvotes: 2

mikal94305

Reputation: 5083

The problem is that your final layer has a linear activation, making the model a regression, not a classification, problem. "Accuracy" is defined when the model classifies data correctly according to class, but it is effectively undefined for a regression problem because the outputs are continuous.

Either get rid of accuracy as a metric and treat the task fully as regression, or turn your problem into a classification problem, using loss='categorical_crossentropy' and activation='softmax'.
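The continuous-output point can be seen in a tiny NumPy sketch: exact-match "accuracy" on real-valued predictions is essentially always zero, even for a near-perfect model, while regression metrics such as MAE and MSE stay informative (toy numbers, chosen for illustration):

```python
import numpy as np

# Continuous targets and near-perfect predictions
y_true = np.array([1.00, 2.00, 3.00])
y_pred = np.array([1.01, 1.99, 3.02])

# "Accuracy" as exact equality is 0 despite the tiny errors
accuracy = np.mean(y_true == y_pred)

# Regression metrics that do make sense here
mae = np.mean(np.abs(y_true - y_pred))
mse = np.mean((y_true - y_pred) ** 2)
```

In Keras this corresponds to compiling with metrics=['mae'] (or similar) instead of metrics=['accuracy'] for a regression model.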

This is a similar problem to yours: Link

For more information see: StackExchange

Upvotes: 28

hpwww

Reputation: 565

I am not sure what your problem is, but your model looks a little odd to me.

This is your model:

lrelu = LeakyReLU(alpha = 0.1)
model = Sequential()
model.add(Dense(126, input_dim=15)) #Dense(output_dim (also hidden weight), input_dim=input_dim)
model.add(lrelu) #Activation

model.add(Dense(252))
model.add(lrelu)
model.add(Dense(1))
model.add(Activation('linear'))

and the visualization of your model is shown as below:

(model graph: both Dense layers connect into the single shared LeakyReLU layer)

There are two layers that could be the output layer of your model, and you haven't determined which one is the actual output. I guess that's why you cannot make correct predictions.

If you want to implement your model like this,

(model graph: each Dense layer followed by its own activation layer)

you should add a separate activation layer each time, rather than reusing the same layer instance.

For example,

model = Sequential()
model.add(Dense(126, input_dim=15)) #Dense(output_dim (also hidden weight), input_dim=input_dim)
model.add(LeakyReLU(alpha = 0.1)) #Activation

model.add(Dense(252))
model.add(LeakyReLU(alpha = 0.1))
model.add(Dense(1))
model.add(Activation('linear'))

Upvotes: 3
