Reputation: 2829
I'm playing around with machine learning and trying to follow along with some examples but am stuck trying to get my data into a Keras LSTM layer.
I have some stock ticker data in a Pandas DataFrame, resampled at 15-minute intervals, with OHLC and a load of other metrics in each row.
My code is below. df is my DataFrame:
x = df.iloc[:, :-1].values
y = df.iloc[:, -1:].values
dimof_input = x.shape[1]
dimof_output = len(set(y.flat))
model = Sequential()
model.add(LSTM(4, input_dim=dimof_input, return_sequences=True))
model.compile(loss='mse', optimizer='rmsprop')
model.fit(x, y, nb_epoch=1, batch_size=1, verbose=2)
When I try to fit the model I get:
Error when checking input: expected lstm_16_input to have 3 dimensions,
but got array with shape (33, 100)
I've copied this from examples elsewhere. I can't quite see how to get the correct shape of data into this model. Can anyone help?
Thanks loads.
Upvotes: 2
Views: 2807
Reputation: 5070
Input shapes: a 3D tensor with shape (batch_size, timesteps, input_dim), plus (optional) 2D tensors with shape (batch_size, output_dim). (From the Keras documentation.)
You specified input_dim=dimof_input, so the model expects a 3D tensor as input, but it got a 2D array. If you give a reference to the tutorial you are following, I can probably say more about the cause of the problem.
You could try reshaping your input data as follows:
x = x.reshape(x.shape[0], 1, x.shape[1])
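As a quick sanity check (a minimal sketch using a dummy array with the same shape as in your error message), this reshape turns each sample into a single timestep, which matches the (batch_size, timesteps, input_dim) layout the LSTM layer expects:

```python
import numpy as np

# Dummy 2D data: (samples, features), matching the (33, 100) in the error
x = np.zeros((33, 100))

# Insert a timesteps axis of length 1: (samples, 1, features)
x = x.reshape(x.shape[0], 1, x.shape[1])

print(x.shape)  # (33, 1, 100)
```

Note that this treats every row as an independent sequence of length 1; if you want the LSTM to actually look back over previous 15-minute bars, you would instead build overlapping windows so that timesteps > 1.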
Also, some information about the input data expected by the Keras LSTM layer can be found here.
Upvotes: 4