user9499285

Reputation:

Input for LSTM in case of time series data

I have a (non-text) dataset of size (1152, 151). I wanted to divide it into 8 batches of 144 samples each for training an LSTM network in Keras, so I reshaped the data to (8, 144, 151) before feeding it to the LSTM. Is this the right input shape? When I passed this as input, with return_sequences=False on this layer and on the next LSTM layer as well, I got an error:

expected ndim=3, found ndim=2.

from keras.models import Sequential
from keras.layers import LSTM, Dense, Activation
from keras.optimizers import Adam

X_train = X_train.reshape((8,144,151))
def deepmodel():
  model = Sequential()
  model.add(LSTM(8,input_shape=(144,151),return_sequences=False))
  model.add(LSTM(8,return_sequences=False))
  model.add(Dense(8))
  model.add(Activation('softmax'))
  adam = Adam()
  model.compile(loss = 'categorical_crossentropy', optimizer = adam)
  return model

Upvotes: 2

Views: 422

Answers (1)

cemsazara

Reputation: 1673

Keras LSTM layers expect 3D input of shape (samples, timesteps, features); the batch size is not part of input_shape. You will set your batch size in model.fit(..., batch_size=8). Also, when stacking LSTM layers, every LSTM except the last one needs return_sequences=True so that it passes a 3D sequence on to the next layer; with return_sequences=False it outputs a 2D tensor, which is exactly what produces the "expected ndim=3" error. Look at the example below; it should clear your error message. If you are looking for multiple time lags, be sure to check out this wonderful blog post.

from keras.models import Sequential
from keras.layers import LSTM, Dense, Activation
from keras.optimizers import Adam

# One sample per row: (samples, timesteps=1, features)
X_train = X_train.reshape((X_train.shape[0], 1, X_train.shape[1]))
def deepmodel():
  model = Sequential()
  # return_sequences=True so the next LSTM layer receives 3D input
  model.add(LSTM(8, input_shape=(X_train.shape[1], X_train.shape[2]), return_sequences=True))
  model.add(LSTM(8, return_sequences=False))
  model.add(Dense(8))
  model.add(Activation('softmax'))
  adam = Adam()
  model.compile(loss='categorical_crossentropy', optimizer=adam)
  return model
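
For completeness, a minimal sketch of how the batch size enters at fit time (assuming y_train is a one-hot label array of shape (X_train.shape[0], 8) to match the Dense(8) softmax output; the epoch count here is arbitrary):

model = deepmodel()
# The batch size is passed to fit(), not baked into the input shape
model.fit(X_train, y_train, batch_size=8, epochs=10)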

Upvotes: 2
