Tom Dörr

Reputation: 1029

Keras model not learning

My Keras model is not learning anything and I can't figure out why. I even reduced the training set size to 5 elements and the model is still not fitting to the training data.

[Image: loss function visualized with TensorBoard]

Here is my code:

import keras
from keras.models import Sequential
from keras.layers import Conv1D, Flatten, Dense

model = Sequential()
model.add(Conv1D(30, filter_length=3, activation='relu', input_shape=(50, 1)))
model.add(Conv1D(40, filter_length=(3), activation='relu'))
model.add(Conv1D(120, filter_length=(3), activation='relu'))
model.add(Flatten())
model.add(Dense(1024, activation='relu'))
model.add(Dense(256, activation='relu'))
model.add(Dense(32, activation='relu'))
model.add(Dense(1, activation='relu'))
model.summary()
model.compile(loss='mse',
              optimizer=keras.optimizers.adam())


train_limit = 5 
batch_size = 4096 
tb = keras.callbacks.TensorBoard(log_dir='./logs/' + run_name + '/', 
    histogram_freq=0, write_images=False)
tb.set_model(model)
model.fit(X_train[:train_limit], y_train[:train_limit],
          batch_size=batch_size,
          nb_epoch=10**4,
          verbose=0,
          validation_data=(X_val[:train_limit], y_val[:train_limit]),
          callbacks=[tb])
score = model.evaluate(X_test, y_test, verbose=0)
print('Test loss:', score)
print('Test accuracy:', score)

Any help is greatly appreciated!

Upvotes: 3

Views: 1713

Answers (2)

Aditya Gupta

Reputation: 43

The last layer of the model has a ReLU activation. Instead, it should have a sigmoid activation function, since this is a binary classification problem. If it were a multi-class classification problem, you should use a softmax activation instead.
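A minimal numpy sketch (not tied to the asker's code) of why sigmoid suits a binary classification output: it squashes any real-valued score into (0, 1), so the output can be read as a probability, whereas ReLU is zero for negative scores and unbounded above.

```python
import numpy as np

def sigmoid(z):
    # maps any real number into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

scores = np.array([-5.0, 0.0, 5.0])  # hypothetical example logits
probs = sigmoid(scores)
# every entry of probs is a valid probability;
# by contrast np.maximum(0, scores) would give [0., 0., 5.]
```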

Upvotes: 1

Lan

Reputation: 6660

This looks like a regression problem. One thing I noticed is that your last layer still has a ReLU activation. I would recommend removing the ReLU from the last layer, since it clamps every negative prediction to zero and kills the gradient there.

Upvotes: 8
