Reputation: 483
I am trying to train a basic LSTM network, but I am running into an error with model.fit. I have two datasets, each with 3145 sequences of length 7. I want to combine these two datasets so that each time step contains both features. As such, I have reshaped my x_train and y_train into the following shapes:
x_train.shape = (3145, 70, 2)
y_train.shape = (3145, 70)
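For example, stacking two (3145, 70) arrays along a new last axis gives these shapes (the placeholder arrays dataset_a and dataset_b below are just for illustration, not my actual data):
import numpy as np

# two placeholder datasets with the same shape as my original inputs
dataset_a = np.random.rand(3145, 70)
dataset_b = np.random.rand(3145, 70)

x_train = np.stack([dataset_a, dataset_b], axis=-1)  # shape (3145, 70, 2)
y_train = np.random.rand(3145, 70)                   # one target per time step

print(x_train.shape)  # (3145, 70, 2)
print(y_train.shape)  # (3145, 70)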
As you can see, I should have 3145 samples, each with 70 time steps, and each time step should have 2 features and one target. I then define the following model:
model = Sequential()
model.add(LSTM(4, input_shape=(x_train.shape[1], x_train.shape[2])))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
To train, I run the line
model.fit(x_train, y_train, epochs=100, batch_size=1, verbose=2)
But this gives me the error
ValueError: Error when checking target: expected dense_23 to have shape (1,) but got array with shape (70,)
I'm confused as to why this error is occurring. With 70 time steps, I should have 70 targets, right?
I would greatly appreciate any help explaining this error!
Upvotes: 0
Views: 210
Reputation: 41
The number of units in your final Dense layer has to match the shape of your target. Your y_train has shape (3145, 70), so the model needs to output 70 values per sample, but Dense(1) only outputs one.
Try changing it to:
model.add(Dense(70))
Note that Keras cannot infer the number of units of a Dense layer on its own; the units argument is required, so you have to set it to 70 explicitly to match the target shape.
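Putting it together, here is a minimal sketch of the corrected model (the random arrays are just placeholders with the same shapes as your data, not your actual values):
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# placeholder data with the same shapes as x_train (3145, 70, 2) and y_train (3145, 70)
x_train = np.random.rand(3145, 70, 2)
y_train = np.random.rand(3145, 70)

model = Sequential()
model.add(LSTM(4, input_shape=(x_train.shape[1], x_train.shape[2])))
model.add(Dense(70))  # 70 outputs per sample, matching y_train's last dimension
model.compile(loss='mean_squared_error', optimizer='adam')

# epochs=1 just to verify the shapes line up; use your real settings for training
model.fit(x_train, y_train, epochs=1, batch_size=1, verbose=2)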
Upvotes: 1