Utpal Datta

Reputation: 456

LSTM Cells, Input Shape Error

I am trying to build an LSTM net with input data that has 41 fields. My idea is that the current output is a function of the current inputs as well as the 49 previous inputs. I am trying to run the following:

from keras.models import Sequential
from keras.layers import LSTM, Dense

CommonModel = Sequential()
CommonModel.add(LSTM(50, return_sequences=True, input_shape=(None, 41)))
CommonModel.add(LSTM(50, return_sequences=True))
CommonModel.add(LSTM(50))
CommonModel.add(Dense(20, activation='relu'))
CommonModel.add(Dense(10, activation='relu'))
CommonModel.add(Dense(1, activation='relu'))
CommonModel.compile(loss='mse', optimizer='adam', metrics=['accuracy'])
CommonModel.summary()

Layer (type)                 Output Shape              Param #
=================================================================
dense_41 (Dense)             (None, None, 50)          2100
_________________________________________________________________
dense_42 (Dense)             (None, None, 20)          1020
_________________________________________________________________
dense_43 (Dense)             (None, None, 10)          210
_________________________________________________________________
dense_44 (Dense)             (None, None, 1)           11
=================================================================
Total params: 3,341
Trainable params: 3,341
Non-trainable params: 0


CommonModel.fit(Axis_X,Axis_Y,epochs=140,batch_size=64)

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-90-f80115738f18> in <module>()
----> 1 CommonModel.fit(Axis_X,Axis_Y,epochs=140,batch_size=64)

~\Anaconda3\lib\site-packages\keras\models.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, **kwargs)
    961                               initial_epoch=initial_epoch,
    962                               steps_per_epoch=steps_per_epoch,
--> 963                               validation_steps=validation_steps)
    964 
    965     def evaluate(self, x=None, y=None,

~\Anaconda3\lib\site-packages\keras\engine\training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, **kwargs)
   1628             sample_weight=sample_weight,
   1629             class_weight=class_weight,
-> 1630             batch_size=batch_size)
   1631         # Prepare validation data.
   1632         do_validation = False

~\Anaconda3\lib\site-packages\keras\engine\training.py in _standardize_user_data(self, x, y, sample_weight, class_weight, check_array_lengths, batch_size)
   1474                                     self._feed_input_shapes,
   1475                                     check_batch_axis=False,
-> 1476                                     exception_prefix='input')
   1477         y = _standardize_input_data(y, self._feed_output_names,
   1478                                     output_shapes,

~\Anaconda3\lib\site-packages\keras\engine\training.py in _standardize_input_data(data, names, shapes, check_batch_axis, exception_prefix)
    111                         ': expected ' + names[i] + ' to have ' +
    112                         str(len(shape)) + ' dimensions, but got array '
--> 113                         'with shape ' + str(data_shape))
    114                 if not check_batch_axis:
    115                     data_shape = data_shape[1:]

ValueError: Error when checking input: expected dense_41_input to have 3 dimensions, but got array with shape (1827, 41)

I also tried input_shape=(41), which doesn't work at all.

Could you please let me know what's wrong?

Upvotes: 0

Views: 162

Answers (1)

nuric

Reputation: 11225

The model setup seems fine, but since your first layer is an LSTM you need to convert your data into a time series. input_shape=(49, 41) would mean 49 timesteps with 41 features at every timestep. You can use TimeseriesGenerator (documentation) to window your data in that manner. Something along the lines of:

data_gen = TimeseriesGenerator(data, targets, length=49)
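Applied to the arrays from the question (Axis_X with shape (1827, 41) and the matching Axis_Y), a minimal sketch could look like the following; the batch size and epoch count are simply carried over from the original fit call, not requirements:

from keras.preprocessing.sequence import TimeseriesGenerator

# Each generated batch stacks 49 consecutive rows of Axis_X, producing the
# 3-D shape (batch_size, 49, 41) that the first LSTM layer expects; the
# target paired with a window ending at row i is Axis_Y[i].
data_gen = TimeseriesGenerator(Axis_X, Axis_Y, length=49, batch_size=64)

# Train from the generator instead of passing the raw 2-D array to fit().
CommonModel.fit_generator(data_gen, epochs=140)

With windows of length 49 you could also declare the first LSTM layer with input_shape=(49, 41) explicitly, as described above.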

Upvotes: 0
