Gehad

Reputation: 66

AttributeError: 'Model' object has no attribute '_output_tensor_cache'

import keras
from keras.layers import Input, Dense
from keras.models import Model
from keras_adamw import AdamW

mlp = Model([
        Dense(10, activation='relu', input_shape=trainX_scaled.shape), #input shape
        Dense(10, activation='relu'),  #Hidden layer
        Dense(10, activation='relu') #output layer
])

optimizer = AdamW(lr=0.001,model=mlp)
mlp.compile(optimizer, loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = mlp.fit(trainX_scaled, train_y, epochs=500, validation_data=(valX_scaled, val_y), batch_size=1)

The error is

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-66-46d3a15c03c4> in <module>()
     19 optimizer = AdamW(lr=0.001,model=mlp)
     20 mlp.compile(optimizer, loss='sparse_categorical_crossentropy', metrics=['accuracy'])
---> 21 history = mlp.fit(trainX_scaled, train_y, epochs=500, validation_data=(valX_scaled, val_y), batch_size=1)

3 frames
/usr/local/lib/python3.7/dist-packages/keras/engine/network.py in call(self, inputs, mask)
    578         cache_key = object_list_uid(inputs)
    579         cache_key += '_' + object_list_uid(masks)
--> 580         if cache_key in self._output_tensor_cache:
    581             return self._output_tensor_cache[cache_key]
    582         else:

AttributeError: 'Model' object has no attribute '_output_tensor_cache'

The error occurs when running model.fit.

Upvotes: 0

Views: 1338

Answers (1)

user11530462

Which TensorFlow version do you have installed on your system? Please import the Keras libraries from tensorflow.keras and re-execute the code above.

from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
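
For reference, here is a minimal sketch of the model above rebuilt on tensorflow.keras. Sequential is used because the original code passes a plain list of layers, which the functional Model class does not accept; the per-sample input_shape and the softmax output layer are assumptions for a 10-class problem trained with sparse_categorical_crossentropy.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# layer sizes mirror the question; input_shape is the per-sample feature shape
mlp = Sequential([
    Dense(10, activation='relu', input_shape=(trainX_scaled.shape[1],)),
    Dense(10, activation='relu'),
    Dense(10, activation='softmax')  # assumed 10-class softmax output
])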

The AdamW API is part of the TensorFlow Addons package. To import the AdamW optimizer, use the code below:

!pip install tensorflow-addons
import tensorflow_addons as tfa
from tensorflow_addons.optimizers import AdamW
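
With those imports in place, here is a hedged sketch of wiring the Addons optimizer into the model above. Note that tfa.optimizers.AdamW takes weight_decay as its first argument (the 1e-4 value here is only a placeholder) and, unlike keras_adamw, does not take a model= argument.

# weight_decay is required by tfa.optimizers.AdamW; tune it for your data
optimizer = tfa.optimizers.AdamW(weight_decay=1e-4, learning_rate=0.001)
mlp.compile(optimizer=optimizer,
            loss='sparse_categorical_crossentropy',
            metrics=['accuracy'])
history = mlp.fit(trainX_scaled, train_y, epochs=500,
                  validation_data=(valX_scaled, val_y), batch_size=1)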

Or you can simply use the Adam optimizer instead of AdamW, as below:

from tensorflow.keras.optimizers import Adam
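
For example, the compile call would then look like this, keeping the learning rate from the question:

mlp.compile(optimizer=Adam(learning_rate=0.001),
            loss='sparse_categorical_crossentropy',
            metrics=['accuracy'])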

Let us know if the issue still persists. Thank you.

Upvotes: 0
