Reputation: 943
I need to use the bagging method with an LSTM, training on time-series data. I have defined the base model and used KerasRegressor to link it to scikit-learn, but I get AttributeError: 'KerasRegressor' object has no attribute 'loss'. How can I fix it?
Update: I used the method from Manoj Mohan's answer (the first one below) and the fit step now succeeds. However, I get a TypeError after modifying Manoj Mohan's class to:
class MyKerasRegressor(KerasRegressor):
    def fit(self, x, y, **kwargs):
        x = np.expand_dims(x, -2)
        super().fit(x, y, **kwargs)

    def predict(self, x, **kwargs):
        x = np.expand_dims(x, -2)
        super().predict(x, **kwargs)
This solves the dimension problem for predict(), the same way as for fit(). The remaining problem is:
TypeError Traceback (most recent call last)
<ipython-input-84-68d76cb73e8b> in <module>
----> 1 pred_bag = bagging_model.predict(x_test)
TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'
Full script:
def model_base_LSTM():
    model_cii = Sequential()

    # Make layers
    model_cii.add(CuDNNLSTM(50, return_sequences=True, input_shape=(1, 20)))
    model_cii.add(Dropout(0.4))
    model_cii.add(CuDNNLSTM(50, return_sequences=True))
    model_cii.add(Dropout(0.4))
    model_cii.add(CuDNNLSTM(50, return_sequences=True))
    model_cii.add(Dropout(0.4))
    model_cii.add(CuDNNLSTM(50, return_sequences=True))
    model_cii.add(Dropout(0.4))
    model_cii.add(Flatten())

    # Output layer
    model_cii.add(Dense(1))

    # Compile
    model_cii.compile(optimizer='adam', loss='mean_squared_error', metrics=['accuracy'])

    return model_cii
model = MyKerasRegressor(build_fn=model_base_LSTM, epochs=100, batch_size=70)
bagging_model = BaggingRegressor(base_estimator=model, n_estimators=10)
train_model = bagging_model.fit(x_train, y_train)
bagging_model.predict(x_test)
Output:
TypeError Traceback (most recent call last)
<ipython-input-84-68d76cb73e8b> in <module>
----> 1 pred_bag = bagging_model.predict(x_test)
TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'
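The error can be reproduced with plain Python, without Keras at all. The overridden predict() above has no return statement, so it yields None for every base estimator; BaggingRegressor then sums the per-estimator predictions, and sum() starts from the integer 0, so the first accumulation is 0 + None. A minimal sketch of that step (not the actual BaggingRegressor internals):

```python
# What each MyKerasRegressor.predict() returns when `return` is missing:
preds = [None, None]

try:
    sum(preds)  # sum() starts at 0, so the first step is 0 + None
except TypeError as e:
    print(e)  # unsupported operand type(s) for +: 'int' and 'NoneType'
```
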
Upvotes: 1
Views: 574
Reputation: 6044
There is an error in the model_base_LSTM() method. Replace
return model
with
return model_cii
To fix "Error when checking input", an extra dimension can be added like this. This also takes care of the scikit-learn (2-dimensional) vs. Keras LSTM (3-dimensional) input mismatch: create a subclass of KerasRegressor that handles the reshape. Note that both overrides must return the result of the superclass call; the version in your update omits return, so predict() yields None, and BaggingRegressor summing those None values is what produces the TypeError.
class MyKerasRegressor(KerasRegressor):
    def fit(self, x, y, **kwargs):
        x = np.expand_dims(x, -2)
        return super().fit(x, y, **kwargs)

    def predict(self, x, **kwargs):
        x = np.expand_dims(x, -2)
        return super().predict(x, **kwargs)
model = MyKerasRegressor(build_fn=model_base_LSTM, epochs=100, batch_size=70)
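As a sanity check on the reshape, here is a standalone NumPy sketch (sample count of 5 is arbitrary; the feature count of 20 matches input_shape=(1, 20) above):

```python
import numpy as np

# scikit-learn hands the estimator 2-D data: (n_samples, n_features)
x = np.zeros((5, 20))

# The LSTM expects 3-D input: (n_samples, timesteps, n_features).
# Inserting a length-1 axis second from the end matches input_shape=(1, 20).
x_3d = np.expand_dims(x, -2)
print(x_3d.shape)  # (5, 1, 20)
```
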
Upvotes: 1