I am running an experiment whose goal is to classify EEG time-series data into 3 classes. However, whenever I run training, the loss is NaN and the accuracy is stuck at chance level (about 0.33 for three classes).
My data is 150 steps long and has 4 channels, all normalized between 0 and 1.
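For reference, a quick shape-and-range check might look like this (a sketch; it assumes X and labels are NumPy arrays, with labels one-hot encoded as categorical_crossentropy requires):

import numpy as np

print(X.shape)           # expected: (n_samples, 150, 4)
print(X.min(), X.max())  # expected: values within [0, 1]
print(labels.shape)      # expected: (n_samples, 3) if one-hot encoded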
I am feeding it into the following model:
from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense
from keras.optimizers import Adam

model = Sequential()
model.add(Conv1D(8, kernel_size=2, strides=1,
                 activation='relu',
                 input_shape=(input_width, num_channels)))
model.add(MaxPooling1D(pool_size=2, strides=2, padding='same'))
model.add(Dropout(0.25))
model.add(Conv1D(9, kernel_size=2, strides=1, activation='relu'))
model.add(MaxPooling1D(pool_size=2, strides=2, padding='same'))
model.add(Dropout(0.25))
model.add(Conv1D(18, kernel_size=2, strides=1, activation='relu'))
model.add(MaxPooling1D(pool_size=2, strides=2, padding='same'))
model.add(Dropout(0.25))
model.add(Conv1D(36, kernel_size=2, strides=1, activation='relu'))
model.add(MaxPooling1D(pool_size=2, strides=2, padding='same'))
model.add(Dropout(0.25))
model.add(Conv1D(72, kernel_size=2, strides=1, activation='relu'))
model.add(MaxPooling1D(pool_size=2, strides=2, padding='same'))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dropout(0.5))
model.add(Dense(128, activation='tanh'))
model.add(Dense(num_labels, activation='softmax'))
and then compile and train it:
optimizer = Adam(lr=0.0001)
model.summary()
model.compile(optimizer=optimizer,
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(X, labels,
          epochs=100,
          batch_size=32)
However, the result is this:
Epoch 1/100
3855/3855 [==============================] - 24s 6ms/step - loss: nan - acc: 0.3331
Epoch 2/100
3855/3855 [==============================] - 25s 7ms/step - loss: nan - acc: 0.3331
.....
Epoch 100/100
3855/3855 [==============================] - 25s 7ms/step - loss: nan - acc: 0.3331
Answer:
I generated synthetic data and trained with your code, and the NaN issue did not occur for me. You probably need to check your data for corruption, such as NaN or infinite values (see the sketch below). Another thing to try is to keep only one Conv/Pooling/Dropout block and see whether the issue still occurs (see the second sketch).
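A data check along these lines (a sketch, assuming X and labels are NumPy arrays) would locate corrupted samples:

import numpy as np

# Any True here would explain a NaN loss
print("NaN in X:", np.isnan(X).any(), "Inf in X:", np.isinf(X).any())
print("NaN in labels:", np.isnan(labels).any())

# Indices of windows containing at least one NaN value
bad = np.unique(np.where(np.isnan(X))[0])
print(bad)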
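And a minimal single-block version of the model for the second experiment (a sketch reusing the names from the question, not a drop-in fix):

model = Sequential()
model.add(Conv1D(8, kernel_size=2, activation='relu',
                 input_shape=(input_width, num_channels)))
model.add(MaxPooling1D(pool_size=2, padding='same'))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(num_labels, activation='softmax'))
model.compile(optimizer=Adam(lr=0.0001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])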