Re Dream

Reputation: 91

'KerasLayer' object has no attribute 'layers'

I'm trying to customize a model taken from TF Hub, but I can't access its layers; I get the following error: 'KerasLayer' object has no attribute 'layers'

Here is my code as an example:

import tensorflow_hub as hub

from tensorflow.keras import layers

feature_extractor_url = "https://tfhub.dev/tensorflow/efficientnet/lite0/feature-vector/1" 

base_model = hub.KerasLayer(feature_extractor_url,
                                         input_shape=(224,224,3))

base_model.trainable = True


import tensorflow
from tensorflow.keras.models import Model

x =  base_model.layers[-10].output
x = tensorflow.keras.layers.Conv2D(4, (3, 3), padding="same", activation="relu")(x)
x = tensorflow.keras.layers.GlobalMaxPooling2D()(x)
x = tensorflow.keras.layers.Flatten()(x)
outputs = tensorflow.keras.layers.Activation('sigmoid', name="example_output")(x)

model = Model(base_model.input, outputs=outputs)

model.summary()

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-43-0501ec56d6c4> in <module>()
     14 from tensorflow.keras.models import Model
     15 
---> 16 x =  base_model.layers[-10].output
     17 x = tensorflow.keras.layers.Conv2D(4, (3, 3), padding="same", activation="relu")(x)
     18 x = tensorflow.keras.layers.GlobalMaxPooling2D()(x)

AttributeError: 'KerasLayer' object has no attribute 'layers'

What I've tried: I built the model using the Sequential API:

model = tf.keras.Sequential([
  base_model,
  layers.Dense(image_data.num_classes)
])

model.summary()

But I still can't access the layers inside base_model.

How can I access the layers from KerasLayer?

Thank you!

Upvotes: 2

Views: 10374

Answers (3)

momo

Reputation: 158

You can access the layers via the weights of the Hub model.

Unfortunately, this topic is not covered directly in the TF docs. This is as deep as I could dig so far; hopefully it sheds some light on accessing the layers of a Hub model.

TF 2.5.0 & TF-Hub 0.12.0 have been used for the below tests.

  1. Layers in KerasLayer object
>>> import tensorflow_hub as hub
>>> model = hub.KerasLayer("https://tfhub.dev/deepmind/ganeval-cifar10-convnet/1")
>>> model
<tensorflow_hub.keras_layer.KerasLayer object at 0x7f0c79372190>
>>> len(model.weights)
57
>>> model.weights[56]
<tf.Variable 'cifar10_convnet/linear/b:0' shape=(10,) dtype=float32, numpy=
array([-0.2734375 , -1.46875   ,  0.484375  ,  1.2265625 ,  0.53515625,
        0.96875   ,  0.3671875 ,  0.02282715, -0.7265625 , -1.078125  ],
      dtype=float32)>
>>> model.weights[56].name
'cifar10_convnet/linear/b:0'

Notice the weights attribute above. KerasLayer also has a get_weights() method. The difference in output is shown below: the former returns tf.Variable objects, while the latter returns NumPy arrays.

>>> len(model.get_weights())
57
>>> model.get_weights()[56]
array([-0.2734375 , -1.46875   ,  0.484375  ,  1.2265625 ,  0.53515625,
        0.96875   ,  0.3671875 ,  0.02282715, -0.7265625 , -1.078125  ],
      dtype=float32)

To get, for example, the names of all layers, simply run:

layers = model.weights
[layer.name for layer in layers]

A hint of my output:

'cifar10_convnet/conv_net_2d/conv_2d_0/w:0',

'cifar10_convnet/conv_net_2d/conv_2d_0/b:0',

'cifar10_convnet/conv_net_2d/batch_norm_0/moving_mean:0',

'cifar10_convnet/conv_net_2d/batch_norm_0/moving_variance:0'

Note that weights, biases, moving means, moving variances, etc. are each listed as separate entries in the output.
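Since each variable is listed separately, a small helper can group them back into per-layer buckets by splitting off the final name component. This is a pure-Python sketch using a hand-copied sample of the names shown above (the real model has 57 entries):

```python
from collections import defaultdict

# A sample of the variable names printed above (hand-copied for illustration).
names = [
    'cifar10_convnet/conv_net_2d/conv_2d_0/w:0',
    'cifar10_convnet/conv_net_2d/conv_2d_0/b:0',
    'cifar10_convnet/conv_net_2d/batch_norm_0/moving_mean:0',
    'cifar10_convnet/conv_net_2d/batch_norm_0/moving_variance:0',
    'cifar10_convnet/linear/b:0',
]

def group_by_layer(names):
    """Group variable names by everything before the last '/' component."""
    groups = defaultdict(list)
    for name in names:
        layer, _, var = name.rpartition('/')
        groups[layer].append(var)
    return dict(groups)

grouped = group_by_layer(names)
# e.g. grouped['cifar10_convnet/conv_net_2d/conv_2d_0'] == ['w:0', 'b:0']
```

On a real Hub model you would pass [w.name for w in model.weights] instead of the hardcoded list.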


  2. Layers in AutoTrackable object

This is for low-level TF2 API users.

>>> import tensorflow_hub as hub
>>> model = hub.load("https://tfhub.dev/deepmind/ganeval-cifar10-convnet/1")
>>> model
<tensorflow.python.training.tracking.tracking.AutoTrackable object at 0x7f95943ec410>
>>> len(model.variables)
57
>>> model.variables[56]
<tf.Variable 'cifar10_convnet/linear/b:0' shape=(10,) dtype=float32, numpy=
array([-0.2734375 , -1.46875   ,  0.484375  ,  1.2265625 ,  0.53515625,
        0.96875   ,  0.3671875 ,  0.02282715, -0.7265625 , -1.078125  ],
      dtype=float32)>

Use "variables" instead of "weights" with this API.

Upvotes: 3

the ycl

Reputation: 1

I ran into the same problem yesterday, but luckily I found two ways to work around it.

1. Use tf.keras.applications together with .layers (these models expose their internal layers):

base_model = tf.keras.applications.MobileNetV2(input_shape=IMG_SHAPE,
                                               include_top=False,
                                               weights='imagenet')
base_model.trainable = False
global_average_layer = tf.keras.layers.GlobalAveragePooling2D()
prediction_layer = tf.keras.layers.Dense(59)
model = tf.keras.Sequential([
  base_model,
  global_average_layer,
  prediction_layer
])
model.summary()
....
fine_tune_at = 100
for layer in base_model.layers[:fine_tune_at]:
  layer.trainable = False

2. Follow the TensorFlow documentation: https://tensorflow.google.cn/hub/tf2_saved_model?hl=en

base_model = hub.KerasLayer(feature_extractor_url,
                                     trainable=True,
                                     input_shape=(224,224,3))

model.summary() output:

Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
keras_layer_1 (KerasLayer)   (None, 1024)              3228864
_________________________________________________________________
dense_2 (Dense)              (None, 59)                60475
=================================================================
Total params: 3,289,339
Trainable params: 3,267,451
Non-trainable params: 21,888

I hope my answer helps you.

Upvotes: -1

arnoegw

Reputation: 1238

The inner structure of the SavedModel loaded into a hub.KerasLayer is inaccessible. For this level of detail, you'll have to turn to the EfficientNet source code instead.

Upvotes: 0
