Azaria Gebremichael

Reputation: 762

TensorFlow and Keras model raises TypeError when calling load_model

I was playing around with a COVID dataset and built a neural network using transfer learning (a pre-trained DenseNet model). The model is below.

import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import *
from tensorflow.keras.regularizers import l1_l2

model6 = Sequential()
model6.add(data_augmentation)  # data augmentation pipeline defined earlier in the notebook
base_model = tf.keras.applications.DenseNet201(input_shape=(100,100,3), include_top=False, pooling='max', weights='imagenet')
model6.add(base_model)
model6.add(BatchNormalization())
model6.add(Dense(2048, activation='relu', kernel_regularizer=l1_l2(0.01)))
model6.add(BatchNormalization())
model6.add(Dense(1, activation='sigmoid'))
for layer in model6.layers:
    layer.trainable = True

model6.compile(loss='binary_crossentropy', optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4), metrics=['accuracy'])

# X_train holds grayscale images, so stack them to 3 channels for DenseNet
stacked_img = np.stack((X_train,)*3, axis=-1)
history6 = model6.fit(stacked_img, y_train,
                      validation_split=0.2, callbacks=[es_callback],  # early-stopping callback defined earlier
                      epochs=10, verbose=1,
                      validation_steps=22)
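
For reference, data_augmentation and es_callback are not defined in the snippet above; a plausible setup might look like this (the exact augmentation layers and early-stopping settings are guesses, not the originals):

from tensorflow.keras.callbacks import EarlyStopping

# hypothetical reconstruction -- the original definitions are not shown in the question
data_augmentation = tf.keras.Sequential([
    tf.keras.layers.experimental.preprocessing.RandomFlip('horizontal'),  # random horizontal mirroring
    tf.keras.layers.experimental.preprocessing.RandomRotation(0.1),       # small random rotations
])
es_callback = EarlyStopping(monitor='val_loss', patience=3, restore_best_weights=True)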

Evaluation works and reaches about 93% accuracy using

model6.evaluate(X_test_stacked, y_test)
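
Here X_test_stacked is presumably the grayscale test set stacked to three channels the same way as the training input:

X_test_stacked = np.stack((X_test,)*3, axis=-1)  # mirror the channel stacking used for X_train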

I can predict on a new image in Colab using the following code

import cv2

image_size = 100  # the model was trained on 100x100 inputs
img_path = '/content/Image_1.jpg'
img_array = cv2.imread(img_path, cv2.IMREAD_GRAYSCALE)       # read the image as grayscale
img_array = cv2.resize(img_array, (image_size, image_size))  # resize to the training input size
img_array = np.stack((img_array,)*3, axis=-1)                # stack to 3 channels, as in training
img_array = np.expand_dims(img_array, axis=0)                # add a batch dimension
prediction = model6.predict(img_array)

I decided to take the model out of Colab, saving it with model.save() as an HDF5 file, and put it into a Flask app, but it keeps raising an error
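
The save call was presumably something along these lines (the path is hypothetical):

model6.save('binary-covid-model-6.h5')  # the .h5 extension makes Keras write the HDF5 format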

TypeError: ('Keyword argument not understood:', 'fill_value')

on the line where I call load_model

from tensorflow.keras.models import load_model    
model = load_model('models/binary-covid-model-6.h5')

...

The debugger caught an exception in your WSGI application. You can now look at the traceback which led to the error.

This appears when I try to access the app in a browser.
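
For context, a minimal sketch of how the model sits in the Flask app (the route and response are simplified placeholders; the error is raised on the load_model line):

from flask import Flask
from tensorflow.keras.models import load_model

app = Flask(__name__)
model = load_model('models/binary-covid-model-6.h5')  # raises the TypeError described above

@app.route('/')
def index():
    # prediction logic would go here once the model loads
    return 'model loaded'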

Upvotes: 0

Views: 1044

Answers (1)

Azaria Gebremichael

Reputation: 762

I now realize you cannot use load_model() in a TensorFlow 2.3.0 environment on a model trained with TensorFlow 2.4.0; the same environment loads a model trained with 2.3.0 without any problem. This is probably due to a change or update in that TensorFlow version, so models saved with TensorFlow 2.4.0 will raise this error when loaded with load_model() under TensorFlow 2.3.0.
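
A quick way to confirm the mismatch is to print the TensorFlow version in both Colab and the Flask environment, and then install a matching version where the model is loaded (the 2.4.0 pin below assumes Colab trained with that version):

import tensorflow as tf
print(tf.__version__)  # run in both the training (Colab) and serving (Flask) environments

# then, in the Flask environment, match the training version, e.g.:
#   pip install tensorflow==2.4.0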

Upvotes: 1
