lmglm

Reputation: 3

Implementing lambda preprocess function for VGGFace

Hi, I'm trying to build a network using transfer learning while fine-tuning a VGGFace implementation:

import numpy as np
import tensorflow as tf
from keras_vggface.vggface import VGGFace
from keras_vggface import utils

img_height, img_width = 224, 224
module = VGGFace(model='resnet50', include_top=False, weights='vggface',
                 input_shape=(img_height, img_width, 3))

model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(img_height, img_width, 3)),
    tf.keras.layers.Lambda(lambda x: utils.preprocess_input(np.expand_dims(x, axis=0), version=2)),
    module,
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(num, activation="sigmoid"),
])

But the following error comes up:

Cannot convert a symbolic Tensor (Placeholder:0) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported

I guess it's because utils.preprocess_input is meant for NumPy arrays rather than tensors, but I need the preprocessing to be part of the network architecture, since I'm dealing with a lot of images and can't hold them all in memory at once. Any suggestions on how to make it work?

Upvotes: 0

Views: 113

Answers (1)

Sadegh Ranjbar

Reputation: 186

You are trying to expand the dimensions of a symbolic tensor with NumPy's expand_dims, which only works on NumPy arrays. Use the TensorFlow equivalent instead:

utils.preprocess_input(tf.expand_dims(x, axis=0), version=2)
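For context, here is a minimal sketch of how that change slots into the Sequential model from the question (it assumes the same keras_vggface imports; whether utils.preprocess_input accepts a symbolic tensor end-to-end may still depend on your keras_vggface version):

import tensorflow as tf
from keras_vggface.vggface import VGGFace
from keras_vggface import utils

module = VGGFace(model='resnet50', include_top=False, weights='vggface',
                 input_shape=(224, 224, 3))

model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(224, 224, 3)),
    # expand dims with a TF op instead of NumPy so the Lambda stays inside the graph
    tf.keras.layers.Lambda(lambda x: utils.preprocess_input(tf.expand_dims(x, axis=0), version=2)),
    module,
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(num, activation="sigmoid"),  # num = your output size, as in the question
])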

Upvotes: 0
