Minions

Reputation: 5487

Neural network in keras not converging

I'm building a simple Neural network in Keras, like the following:

# create model
model = Sequential()
model.add(Dense(1000, input_dim=x_train.shape[1], activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Compile model
model.compile(loss='mean_squared_error', metrics=['accuracy'], optimizer='RMSprop')
# Fit the model
model.fit(x_train, y_train, epochs=20, batch_size=700, verbose=2)
# evaluate the model
scores = model.evaluate(x_test, y_test, verbose=0)
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

The shape of the used data is:

x_train = (49972, 601) 
y_train = (49972, 1)

My problem is that the network is not converging; the accuracy stays stuck around 0.02, as shown below:

Epoch 1/20
 - 1s - loss: 3.2222 - acc: 0.0174
Epoch 2/20
 - 1s - loss: 3.1757 - acc: 0.0187
Epoch 3/20
 - 1s - loss: 3.1731 - acc: 0.0212
Epoch 4/20
 - 1s - loss: 3.1721 - acc: 0.0220
Epoch 5/20
 - 1s - loss: 3.1716 - acc: 0.0225
Epoch 6/20
 - 1s - loss: 3.1711 - acc: 0.0235
Epoch 7/20
 - 1s - loss: 3.1698 - acc: 0.0245
Epoch 8/20
 - 1s - loss: 3.1690 - acc: 0.0251
Epoch 9/20
 - 1s - loss: 3.1686 - acc: 0.0257
Epoch 10/20
 - 1s - loss: 3.1679 - acc: 0.0261
Epoch 11/20
 - 1s - loss: 3.1674 - acc: 0.0267
Epoch 12/20
 - 1s - loss: 3.1667 - acc: 0.0277
Epoch 13/20
 - 1s - loss: 3.1656 - acc: 0.0285
Epoch 14/20
 - 1s - loss: 3.1653 - acc: 0.0288
Epoch 15/20
 - 1s - loss: 3.1653 - acc: 0.0291

I used the Sklearn library to build the same structure with the same data, and it works perfectly, showing an accuracy higher than 0.5:

model = Pipeline([
        ('classifier', MLPClassifier(hidden_layer_sizes=(1000), activation='relu',
                                     max_iter=20, verbose=2, batch_size=700, random_state=0))
    ])

I'm totally sure that I used the same data for both models, and this is how I prepare it:

def load_data():
    le = preprocessing.LabelEncoder()
    with open('_DATA_train.txt', 'rb') as fp:
        train = pickle.load(fp)
    with open('_DATA_test.txt', 'rb') as fp:
        test = pickle.load(fp)

    x_train = train[:,0:(train.shape[1]-1)]
    y_train = train[:,(train.shape[1]-1)]
    y_train = le.fit_transform(y_train).reshape([-1,1])

    x_test = test[:,0:(test.shape[1]-1)]
    y_test = test[:,(test.shape[1]-1)]
    # use transform (not fit_transform) so test labels reuse the training encoding
    y_test = le.transform(y_test).reshape([-1,1])

    print(x_train.shape, '  ' , y_train.shape)
    print(x_test.shape, '  ' , y_test.shape)
    return x_train, y_train, x_test, y_test

What is the problem with the Keras structure?

Edited:

It's a multi-class classification problem: y_train takes the values [0, 1, 2, 3].

Upvotes: 2

Views: 2642

Answers (1)

Jeremy Bare

Reputation: 550

For a multiclass problem your labels should be one-hot encoded. For example, if the options are [0, 1, 2, 3] and the label is 1, then it should be [0, 1, 0, 0].
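In Keras this is what `keras.utils.to_categorical` does; as a minimal sketch, the same transformation in plain numpy (assuming integer labels starting at 0) looks like:

```python
import numpy as np

def one_hot(labels, num_classes):
    """Map integer class labels to one-hot rows, e.g. 1 -> [0, 1, 0, 0]."""
    labels = np.asarray(labels).ravel().astype(int)
    encoded = np.zeros((labels.size, num_classes))
    encoded[np.arange(labels.size), labels] = 1.0
    return encoded

print(one_hot([0, 1, 2, 3, 1], num_classes=4))
```

You would apply this to `y_train` and `y_test` after the `LabelEncoder` step, so the targets have shape `(n_samples, 4)` instead of `(n_samples, 1)`.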

Your final layer should be a dense layer with 4 units and an activation of softmax.

model.add(Dense(4, activation='softmax'))

And your loss should be categorical_crossentropy:

model.compile(loss='categorical_crossentropy', metrics=['accuracy'], optimizer='RMSprop')

Upvotes: 5

Related Questions