yamini goel

Reputation: 539

ValueError: Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=2 [keras]

I got the error: ValueError: Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=2 with the following code:

def make_model():
  model = Sequential()      

  model.add(Conv2D(20,(5,5), input_shape = (24,48,30), activation = "relu", strides = 1, padding = "valid"))
  model.add(MaxPooling2D(pool_size=(2,2)))        
  model.add(Conv2D(50, (5,5), use_bias = 50))    
  model.add(MaxPooling2D(pool_size=(2,2)))    
  model.add(Flatten())
  model.add(Dense(20, activation = "relu"))
  model.add(LSTM(50, activation="relu", return_sequences=True))

  return model

My input is 30 matrices of size 24*48 each.

Upvotes: 3

Views: 9431

Answers (1)

George

Reputation: 5681

The problem is that the output of the last Dense layer (just before the LSTM layer) has shape (?, 20), a 2D tensor, while the LSTM layer expects a 3D tensor of shape (batch, timesteps, features). So you can expand the dimensions to add one more axis before feeding the output to the LSTM layer.

You can expand the dimensions using tf.expand_dims (assuming you use TensorFlow as the backend).
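To illustrate the shape change tf.expand_dims performs, here is a minimal sketch using NumPy's np.expand_dims, which has the same semantics (this is just an illustration, not part of the model code):

```python
import numpy as np

# A batch of 4 Dense outputs, shape (4, 20) -- 2D, which the LSTM rejects.
dense_out = np.zeros((4, 20))

# Adding a trailing axis gives (4, 20, 1):
# 4 samples, 20 timesteps, 1 feature per timestep -- a valid LSTM input.
expanded = np.expand_dims(dense_out, axis=-1)
print(expanded.shape)  # (4, 20, 1)
```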

import tensorflow as tf
from tensorflow.keras.layers import (Input, Conv2D, MaxPooling2D,
                                     Flatten, Dense, LSTM)

input_layer = Input((30, 24, 48))

# input_shape is not needed here: in the functional API the shape
# comes from the Input layer.
model = Conv2D(20, (5, 5), activation="relu", strides=1, padding="valid")(input_layer)
model = MaxPooling2D(pool_size=(2, 2))(model)
model = Conv2D(50, (5, 5), use_bias=True)(model)  # use_bias expects a boolean
model = MaxPooling2D(pool_size=(2, 2))(model)
model = Flatten()(model)
model = Dense(20, activation="relu")(model)
model = tf.expand_dims(model, axis=-1)  # (?, 20) -> (?, 20, 1)
model = LSTM(50, activation="relu", return_sequences=True)(model)

(I used the functional API rather than the Sequential model, since it is more flexible.)

If you want to use the Sequential model:

    model = Sequential()

    model.add(Conv2D(20, (5, 5), input_shape=(30, 24, 48), activation="relu", strides=1, padding="valid"))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Conv2D(50, (5, 5), use_bias=True))  # use_bias expects a boolean
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Flatten())
    model.add(Dense(20, activation="relu"))
    model.add(Lambda(lambda x: tf.expand_dims(x, axis=-1)))  # operate on x, not model.output
    model.add(LSTM(50, activation="relu", return_sequences=True))

Note that in the Sequential model, tf.expand_dims must be wrapped inside a Lambda layer.
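As a sanity check, the shapes flowing through the model above can be traced by hand. The sketch below uses a hypothetical helper conv_out for the output length of a "valid" convolution or pooling along one axis:

```python
def conv_out(n, k, s=1):
    # Output length of a 'valid' conv/pool: window k, stride s, input length n.
    return (n - k) // s + 1

h, w = 30, 24                                 # spatial dims (48 input channels)
h, w = conv_out(h, 5), conv_out(w, 5)         # Conv2D 5x5 valid  -> (26, 20, 20)
h, w = conv_out(h, 2, 2), conv_out(w, 2, 2)   # MaxPool 2x2       -> (13, 10, 20)
h, w = conv_out(h, 5), conv_out(w, 5)         # Conv2D 5x5        -> (9, 6, 50)
h, w = conv_out(h, 2, 2), conv_out(w, 2, 2)   # MaxPool 2x2       -> (4, 3, 50)
flat = h * w * 50                             # Flatten           -> 600
print(flat)  # 600
```

After the Dense layer this becomes (?, 20), and expand_dims turns it into (?, 20, 1): 20 timesteps of 1 feature each, which the LSTM accepts.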

Upvotes: 6
