Reputation: 379
Dear all: I'm very new to deep learning. I was trying to add a for loop that tests all possible combinations of hyperparameters to find the best result. Currently, this is what I have:
def coeff_determination(y_true, y_pred):
    SS_res = K.sum(K.square(y_true - y_pred))
    SS_tot = K.sum(K.square(y_true - K.mean(y_true)))
    return 1 - SS_res / (SS_tot + K.epsilon())
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.3)
x_train = x_train.to_numpy()
x_test = x_test.to_numpy()
y_train = y_train.to_numpy()
y_test = y_test.to_numpy()
print(x_train.shape)
print(y_train.shape)
print(x_test.shape)
print(y_test.shape)
(37118, 105)
(37118,)
(15908, 105)
(15908,)
timesteps = 3
features = 35 #this is the number of features
x_train = x_train.reshape((x_train.shape[0], timesteps, features))
x_test = x_test.reshape((x_test.shape[0], timesteps, features))
dense_layers = [0, 1, 2]
layer_sizes = [32, 64, 128]
LSTM_layers = [1, 2, 3]

for dense_layer in dense_layers:
    for layer_size in layer_sizes:
        for LSTM_layer in LSTM_layers:
            NAME = "{}-lstm-{}-nodes-{}-dense-{}".format(LSTM_layer, layer_size, dense_layer, int(time.time()))
            tensorboard = TensorBoard(log_dir=f"LSTM_logs\\{NAME}")
            print(NAME)

            model = Sequential()
            model.add(LSTM(layer_size, input_shape=(x_train.shape[1], x_train.shape[2]), return_sequences=True))
            for i in range(LSTM_layer - 1):
                model.add(LSTM(layer_size, input_shape=(x_train.shape[1], x_train.shape[2]), return_sequences=True))
            for i in range(dense_layer):
                model.add(Dense(layer_size))
            model.add(Dense(1))
            model.compile(loss='mae', optimizer='adam', metrics=[coeff_determination])

            epochs = 10
            result = model.fit(x_train, y_train, epochs=epochs, batch_size=72, validation_data=(x_test, y_test), verbose=2, shuffle=False)
However, I got a traceback that says the following:
ValueError: Error when checking target: expected dense_192 to have 3 dimensions, but got array with shape (37118, 1)
The error occurs on the following line:
---> 19 result = model.fit(x_train, y_train, epochs=epochs, batch_size=72, validation_data=(x_test, y_test), verbose=2, shuffle=False)
Could anyone please give me a hint on how to solve this problem? Thanks a lot for your time and support.
Sincerely
Wilson
Upvotes: 0
Views: 48
Reputation: 786
Use return_sequences=False for your last LSTM layer so that it only returns a vector containing the last hidden state. With return_sequences=True on every LSTM layer, the final Dense(1) is applied to each timestep and produces a 3-D output, which does not match your 2-D target of shape (37118, 1).
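For example, the model-building part of your loop could look roughly like this (an untested sketch based on your code; build_model is just an illustrative helper name, and it reuses the Sequential, LSTM, Dense and coeff_determination names you already have in scope):

def build_model(LSTM_layer, dense_layer, layer_size, timesteps, features):
    model = Sequential()
    for i in range(LSTM_layer):
        last = (i == LSTM_layer - 1)  # only the final LSTM layer drops the time axis
        if i == 0:
            model.add(LSTM(layer_size, input_shape=(timesteps, features), return_sequences=not last))
        else:
            model.add(LSTM(layer_size, return_sequences=not last))
    for _ in range(dense_layer):
        model.add(Dense(layer_size))
    model.add(Dense(1))  # output shape (batch, 1), matching y_train
    model.compile(loss='mae', optimizer='adam', metrics=[coeff_determination])
    return model

Built this way, a single-LSTM configuration gets return_sequences=False right away, and stacked configurations keep it True for every layer except the last.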
Sincerely,
Alexander
For more details, see: How to use return_sequences option and TimeDistributed layer in Keras?
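If you want to see the difference in output shapes directly, here is a quick check (illustrative only, assuming TensorFlow 2.x Keras with eager execution; the dimensions mirror your data, i.e. 3 timesteps and 35 features):

import numpy as np
from tensorflow.keras.layers import LSTM

x = np.random.rand(8, 3, 35).astype("float32")    # (batch, timesteps, features)
print(LSTM(32, return_sequences=True)(x).shape)   # (8, 3, 32) -> one hidden state per timestep
print(LSTM(32, return_sequences=False)(x).shape)  # (8, 32)    -> only the last hidden state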
Upvotes: 1