Reputation: 547
Been trying to make a neural network in Keras, but ran into an issue where there is a shape mismatch between one of my Dense layers and its Activation layer. Am I missing something obvious? Using the TensorFlow backend.
print(x_train.shape)
print(y_train.shape)
(1509, 476, 4)
(1509,)
Then my model is as follows:
###Setup Keras to create a bidirectional convolutional recurrent NN based on DanQ NN
###See https://github.com/uci-cbcl/DanQ
model = Sequential()
model.add(Conv1D(filters=320,
                 kernel_size=26,
                 padding="valid",
                 activation="relu",
                 strides=1,
                 input_shape=(476, 4)
                 ))
model.add(MaxPooling1D(pool_size=13, strides=13))
model.add(Dropout(0.2))
model.add(keras.layers.wrappers.Bidirectional(LSTM(320, return_sequences=True, input_shape=(None, 320))))
model.add(Flatten())
model.add(Dense(input_dim=34*640, units=925))
model.add(Activation('relu'))
model.add(Dense(input_dim=925, units=919))
model.add(Activation('sigmoid'))
print('compiling model')
model.compile(loss='binary_crossentropy', optimizer='rmsprop', class_mode="binary")
print('running at most 60 epochs')
model.fit(x_train, y_train.T, batch_size=100, epochs=60, shuffle=True, verbose=2, validation_split=0.1)
tresults = model.evaluate(x_test, y_test, verbose=2)
print(tresults)
print(model.output_shape)
But I get the following error:
ValueError: Error when checking target: expected activation_48 to have shape (None, 919) but got array with shape (1509, 1)
The error seems to originate at the input to the second Activation layer, the one using a sigmoid activation, e.g.:
model.add(Dense(input_dim=925, units=919))
model.add(Activation('sigmoid'))
Why would there be a mismatch?
Upvotes: 1
Views: 1312
Reputation: 236
As mentioned in @djk47463's comment, your output now has 919 values per sample, because that is the number of units in the last layer of your network, while your targets (y_train) have shape (1509,), i.e. one value per sample. To correct this, either set your last layer's units to 1, or add a new final layer with an output dimension of 1.
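A minimal sketch of that fix, showing just the dense head of your model (the 925-unit hidden layer and the 34 * 640 flattened input size are taken from your code; the final Dense is reduced to a single sigmoid unit so the output shape matches binary labels of shape (n_samples,)):

```python
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation

model = Sequential()
# Stand-in for the flattened features coming out of the Conv/LSTM
# stack in the question (34 * 640 = 21760 values per sample).
model.add(keras.Input(shape=(34 * 640,)))
model.add(Dense(925))
model.add(Activation('relu'))
# One unit instead of 919, so the output shape (None, 1) matches
# a binary target of shape (n_samples,).
model.add(Dense(1))
model.add(Activation('sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='rmsprop')
print(model.output_shape)  # (None, 1)
```

With this head in place, `model.fit(x_train, y_train, ...)` no longer raises the target-shape error, since the model emits one probability per sample.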
Upvotes: 1
Reputation: 167
In your code,
model.add(Conv1D(filters=320,
                 kernel_size=26,
                 padding="valid",
                 activation="relu",
                 strides=1,
                 input_shape=(476, 4)
                 ))
try using input_dim=4 in place of input_shape=(476, 4). Maybe it will work.
Upvotes: 0