Reputation: 293
I have a dataset where x_train has shape (34650, 10, 1), y_train has shape (34650,), x_test has shape (17067, 10, 1) and y_test has shape (17067,).
I am building a simple CNN model:
from keras.models import Model
from keras.layers import Input, Conv1D, MaxPooling1D, Dropout, Dense

input_layer = Input(shape=(10, 1))
conv2 = Conv1D(filters=64,
               kernel_size=3,
               strides=1,
               activation='relu')(input_layer)
pool1 = MaxPooling1D(pool_size=1)(conv2)
drop1 = Dropout(0.5)(pool1)
pool2 = MaxPooling1D(pool_size=1)(drop1)
conv3 = Conv1D(filters=64,
               kernel_size=3,
               strides=1,
               activation='relu')(pool2)
drop2 = Dropout(0.5)(conv3)
conv4 = Conv1D(filters=64,
               kernel_size=3,
               strides=1,
               activation='relu')(drop2)
pool3 = MaxPooling1D(pool_size=1)(conv4)
conv5 = Conv1D(filters=64,
               kernel_size=3,
               strides=1,
               activation='relu')(pool3)
output_layer = Dense(1, activation='sigmoid')(conv5)
model_2 = Model(inputs=input_layer, outputs=output_layer)
But when I try to fit the model
model_2.compile(loss='mse', optimizer='adam')
history = model_2.fit(x_train, y_train,
                      batch_size=128,
                      epochs=2,
                      verbose=1,
                      validation_data=(x_test, y_test))
I am getting this error:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-177-aee9b3241a20> in <module>()
4 epochs=2,
5 verbose=1,
----> 6 validation_data=(x_test, y_test))
/usr/local/lib/python3.6/dist-packages/keras/engine/training_utils.py in standardize_input_data(data, names, shapes, check_batch_axis, exception_prefix)
133 ': expected ' + names[i] + ' to have ' +
134 str(len(shape)) + ' dimensions, but got array '
--> 135 'with shape ' + str(data_shape))
136 if not check_batch_axis:
137 data_shape = data_shape[1:]
ValueError: Error when checking target: expected dense_14 to have 3 dimensions, but got array with shape (34650, 1)
The shapes of x_train and x_test are already 3-dimensional, so why am I getting this error?
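For reference, calling model_2.summary() on the model above shows the mismatch (a minimal check; the layer names are illustrative, although dense_14 matches the one in the traceback):

model_2.summary()
# The last rows of the summary report output shapes roughly like:
#   conv1d_5 (Conv1D)    (None, 2, 64)
#   dense_14 (Dense)     (None, 2, 1)   <- the model output is 3D, but y_train is (34650,)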
Upvotes: 2
Views: 57
Reputation: 22021
This is because your model's output is 3D while your target is 2D. The Conv1D and MaxPooling1D layers all preserve the temporal axis, so the final Dense layer is applied per timestep and produces a tensor of shape (batch, timesteps, 1), while Keras reshapes your target (34650,) to (34650, 1). Nothing inside your network reduces 3D to 2D; to do that you can use global pooling or Flatten before the output layer. Below is an example:
import numpy as np
from keras.models import Model
from keras.layers import (Input, Conv1D, MaxPooling1D, Dropout, Dense,
                          GlobalMaxPool1D)

n_sample = 100
X = np.random.uniform(0, 1, (n_sample, 10, 1))
y = np.random.randint(0, 2, n_sample)

input_layer = Input(shape=(10, 1))
conv2 = Conv1D(filters=64,
               kernel_size=3,
               strides=1,
               activation='relu')(input_layer)
pool1 = MaxPooling1D(pool_size=1)(conv2)
drop1 = Dropout(0.5)(pool1)
pool2 = MaxPooling1D(pool_size=1)(drop1)
conv3 = Conv1D(filters=64,
               kernel_size=3,
               strides=1,
               activation='relu')(pool2)
drop2 = Dropout(0.5)(conv3)
conv4 = Conv1D(filters=64,
               kernel_size=3,
               strides=1,
               activation='relu')(drop2)
pool3 = MaxPooling1D(pool_size=1)(conv4)
conv5 = Conv1D(filters=64,
               kernel_size=3,
               strides=1,
               activation='relu')(pool3)
x = GlobalMaxPool1D()(conv5)  # =====> from 3D to 2D (GlobalAveragePooling1D or Flatten also work)
output_layer = Dense(1, activation='sigmoid')(x)

model_2 = Model(inputs=input_layer, outputs=output_layer)
model_2.compile(optimizer='adam', loss='binary_crossentropy')
model_2.fit(X, y, epochs=3)
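If you prefer Flatten over global pooling, only the reduction layer changes; a minimal sketch of that variant, reusing the layers defined above:

from keras.layers import Flatten

x = Flatten()(conv5)  # (None, 2, 64) -> (None, 128): collapses the time and filter axes into one vector
output_layer = Dense(1, activation='sigmoid')(x)
model_2 = Model(inputs=input_layer, outputs=output_layer)
model_2.compile(optimizer='adam', loss='binary_crossentropy')

Global pooling keeps the number of Dense weights independent of the sequence length, while Flatten keeps per-timestep information at the cost of more parameters.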
Upvotes: 1