Reputation: 65
I'm currently trying to scale up a time series example I found in a book. I have been trying to move it to the functional API, but I'm having problems. The error I'm experiencing with the functional model is:
Traceback (most recent call last):
  File "merge_n.py", line 57, in <module>
    lstm = LSTM(4, batch_input_shape=(batch_size, look_back, 1), stateful=True)(inputs)
  File "/Users/pjhampton/Desktop/MTL/lib/python3.5/site-packages/keras/layers/recurrent.py", line 243, in __call__
    return super(Recurrent, self).__call__(inputs, **kwargs)
  File "/Users/pjhampton/Desktop/MTL/lib/python3.5/site-packages/keras/engine/topology.py", line 541, in __call__
    self.assert_input_compatibility(inputs)
  File "/Users/pjhampton/Desktop/MTL/lib/python3.5/site-packages/keras/engine/topology.py", line 440, in assert_input_compatibility
    str(K.ndim(x)))
ValueError: Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=4
Sequential Model (original)
########################################################
# main input
########################################################
look_back = 5
trainX, trainY = create_dataset(train, look_back)
testX, testY = create_dataset(test, look_back)
# reshape input to be [samples, time steps, features]
trainX = numpy.reshape(trainX, (trainX.shape[0], trainX.shape[1], 1))
testX = numpy.reshape(testX, (testX.shape[0], testX.shape[1], 1))
batch_size = 1
model = Sequential()
model.add(LSTM(4, batch_input_shape=(batch_size, look_back, 1), stateful=True))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
for i in range(100):
    model.fit(trainX, trainY, epochs=1, batch_size=batch_size, verbose=2, shuffle=False)
    model.reset_states()
Functional API model (what I've tried)
inputs = Input(shape=(batch_size, look_back, 1))
lstm = LSTM(4, batch_input_shape=(batch_size, look_back, 1), stateful=True)(inputs)
dense = Dense(1)(lstm)
model = Model(inputs=inputs, outputs=dense)
model.compile(loss='mse', optimizer='adam')
for i in range(100):
    model.fit(trainX, trainY, epochs=1, batch_size=batch_size, verbose=2, shuffle=False)
    model.reset_states()
Full code: https://friendpaste.com/3Zg3VKBs3qd7FNXubNONzN
Upvotes: 0
Views: 911
Reputation: 4050
You've specified that your RNN is stateful, therefore you need to specify the batch_shape in the input. Note that Input(shape=...) expects a per-sample shape and prepends a batch axis automatically, which is why your 3-tuple produced a 4-D tensor; Input(batch_shape=...) takes the full shape, batch axis included.
inputs = Input(batch_shape=(batch_size, look_back, 1))
lstm = LSTM(4, stateful=True)(inputs)
dense = Dense(1)(lstm)
model = Model(inputs=inputs, outputs=dense)
model.compile(loss='mse', optimizer='adam')
for i in range(100):
    model.fit(trainX, trainY, epochs=1, batch_size=batch_size, verbose=2, shuffle=False)
    model.reset_states()
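To see why the original attempt raised "expected ndim=3, found ndim=4", here is a minimal sketch (no Keras required) that mimics how the shape and batch_shape arguments of Input determine the full tensor shape. The full_shape helper is hypothetical, not a Keras function:

```python
def full_shape(shape=None, batch_shape=None):
    """Mimic the tensor shape Keras infers for an Input layer.

    `shape` is per-sample: a batch axis (None) gets prepended.
    `batch_shape` already includes the batch axis and is used as-is.
    """
    if batch_shape is not None:
        return tuple(batch_shape)
    return (None,) + tuple(shape)

batch_size, look_back = 1, 5

# The question's version: batch_size was folded into the per-sample
# shape, so the inferred tensor is 4-D and the LSTM rejects it.
wrong = full_shape(shape=(batch_size, look_back, 1))
print(wrong, len(wrong))   # (None, 1, 5, 1) 4

# The answer's version: batch_shape carries the batch axis explicitly,
# keeping the tensor 3-D, as the stateful LSTM requires.
right = full_shape(batch_shape=(batch_size, look_back, 1))
print(right, len(right))   # (1, 5, 1) 3
```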
It seems that the sequential model is exactly what you are looking for though.
Upvotes: 2