Reputation: 39
I want to build a deep RNN where my X_train shape is (318, 39) and my y_train has shape (318,). When I execute the code below:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(20, input_shape=X_train.shape[1:], activation='relu', return_sequences=True))
model.add(LSTM(20, activation='relu'))
model.add(Dense(10, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='adam',loss='binary_crossentropy',metrics=['accuracy'])
history = model.fit(X_train,y_train,batch_size=20,epochs=250)
I'm getting the following error:
ValueError: Input 0 is incompatible with layer lstm_60: expected ndim=3, found ndim=2
Upvotes: 1
Views: 451
Reputation: 11198
Just reshape X_train before calling fit:
X_train = X_train.reshape(X_train.shape[0], X_train.shape[1], 1)
Note that reshape returns a new array rather than modifying X_train in place, so the result must be assigned back.
LSTM needs a 3D array (batch size, timesteps, features). In your case, as you have 1 feature, you need to add one extra dimension.
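As a quick sanity check (using a dummy array standing in for the real X_train, which isn't shown in the question), the reshape turns the 2D array into the 3D shape the LSTM expects:

```python
import numpy as np

# Dummy array with the question's shape (318, 39)
X_train = np.zeros((318, 39))

# Add a trailing feature dimension: (318, 39) -> (318, 39, 1)
X_train = X_train.reshape(X_train.shape[0], X_train.shape[1], 1)

print(X_train.shape)  # (318, 39, 1)
```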
Upvotes: 1
Reputation: 1377
Since you are using LSTM, I am assuming your input data is sequential, i.e., you have 318 examples where each example has 39 time steps. If that is the case, you should first reshape your input data:
import numpy as np
X_train = np.expand_dims(X_train, -1)
This will reshape your X_train to a shape of (318, 39, 1), and then it will work (only if my initial assumption is correct).
Upvotes: 2
Reputation: 8585
The expected input shape of the LSTM layer is [batch, timesteps, features]. You're passing [batch, timesteps]. What you want to do is to pass [batch, timesteps, 1] (expand the dimension on the right). You could do it like this:
X_train = X_train[..., None]
Upvotes: 1
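All three suggestions in this thread are equivalent; a minimal check with a dummy array standing in for the real X_train:

```python
import numpy as np

# Dummy 2D array with the question's shape (318, 39)
X = np.arange(318 * 39, dtype=np.float32).reshape(318, 39)

a = X.reshape(X.shape[0], X.shape[1], 1)  # explicit reshape
b = np.expand_dims(X, -1)                 # append an axis at the end
c = X[..., None]                          # indexing with None (np.newaxis)

# All three produce the same (318, 39, 1) array
assert a.shape == b.shape == c.shape == (318, 39, 1)
assert np.array_equal(a, b) and np.array_equal(b, c)
```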