Reputation: 1
I'm a beginner in deep learning. I'm trying to use Keras to build an LSTM for a regression problem, and I would like the network to exploit the memory cell between one prediction and the next.
In more detail: I have a Keras network with 2 hidden LSTM layers and 1 output layer for a regression task. The batch_size is 7, the timestep is 1, and I have 5749 samples.
I only want to understand whether using timestep == 1 is the same thing as using an MLP instead of an LSTM. By time_step I'm referring to the reshape phase for the input of the Sequential model in Keras. The output is a single regression value.
I'm not interested in the previous inputs; I'm only interested in the output of the network as information for the next prediction.
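For reference, here's a minimal sketch of roughly what I mean (the feature count and layer sizes are placeholders, not my real values):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

n_features = 10  # placeholder for my actual number of features

# 5749 samples reshaped to (samples, timesteps=1, features)
X = np.random.rand(5749, n_features).reshape(5749, 1, n_features)
y = np.random.rand(5749, 1)

model = Sequential()
model.add(LSTM(32, return_sequences=True, input_shape=(1, n_features)))  # hidden layer 1
model.add(LSTM(32))                                                      # hidden layer 2
model.add(Dense(1))                                                      # single regression output
model.compile(optimizer='adam', loss='mse')
model.fit(X, y, batch_size=7, epochs=10)
```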
Thank you in advance!
Upvotes: 0
Views: 396
Reputation: 86600
You can say so :)
You're right in thinking that you won't have any recurrency anymore.
But internally there will still be more operations than in regular Dense layers, due to the existence of more kernels.
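For example, a quick comparison of parameter counts (the sizes here are arbitrary) shows those extra kernels: an LSTM layer keeps 4 kernels, one per gate, so it has roughly 4x the weights of a Dense layer with the same number of units.

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

n_features = 10  # arbitrary input size, just for illustration

lstm_model = Sequential([LSTM(32, input_shape=(1, n_features))])
dense_model = Sequential([Dense(32, input_shape=(n_features,))])

print(lstm_model.count_params())   # 4 * 32 * (10 + 32 + 1) = 5504
print(dense_model.count_params())  # (10 + 1) * 32          = 352
```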
But be careful: with stateful=True, it will still be a recurrent LSTM! If you're interested in creating custom operations with the memory/state of the cells, you could try creating your own recurrent cell, taking the LSTMCell code as a template. Then you'd use that cell in an RNN(CustomCell, ...) layer, as in the sketch below.
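A very rough skeleton of that idea (the internals are just a placeholder mixing input and state, not a real LSTM; the methods follow the cell interface that RNN expects):

```python
from keras import backend as K
from keras.layers import Layer, RNN

class CustomCell(Layer):
    """Minimal recurrent cell skeleton; see the LSTMCell source for a full template."""
    def __init__(self, units, **kwargs):
        self.units = units
        self.state_size = units  # one state tensor of shape (batch, units)
        super(CustomCell, self).__init__(**kwargs)

    def build(self, input_shape):
        self.kernel = self.add_weight(shape=(input_shape[-1], self.units),
                                      initializer='glorot_uniform', name='kernel')
        self.recurrent_kernel = self.add_weight(shape=(self.units, self.units),
                                                initializer='orthogonal',
                                                name='recurrent_kernel')
        self.built = True

    def call(self, inputs, states):
        prev_state = states[0]
        # Placeholder update: combine the current input with the previous state
        output = K.dot(inputs, self.kernel) + K.dot(prev_state, self.recurrent_kernel)
        return output, [output]

# Use it like any other recurrent layer:
layer = RNN(CustomCell(32))
```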
Upvotes: 2