Shaheryar Sohail

Reputation: 573

NotImplementedError: Layers with arguments in `__init__` must override `get_config`

I'm trying to save my TensorFlow model using model.save(), but I am getting this error.

The model summary is provided here: Model Summary

The code for the transformer model:

def transformer(vocab_size, num_layers, units, d_model, num_heads, dropout, name="transformer"):
    inputs = tf.keras.Input(shape=(None,), name="inputs")
    dec_inputs = tf.keras.Input(shape=(None,), name="dec_inputs")

    enc_padding_mask = tf.keras.layers.Lambda(
        create_padding_mask, output_shape=(1, 1, None),
        name='enc_padding_mask')(inputs)
    # mask the future tokens for decoder inputs at the 1st attention block
    look_ahead_mask = tf.keras.layers.Lambda(
        create_look_ahead_mask,
        output_shape=(1, None, None),
        name='look_ahead_mask')(dec_inputs)
    # mask the encoder outputs for the 2nd attention block
    dec_padding_mask = tf.keras.layers.Lambda(
        create_padding_mask, output_shape=(1, 1, None),
        name='dec_padding_mask')(inputs)

    enc_outputs = encoder(
        vocab_size=vocab_size,
        num_layers=num_layers,
        units=units,
        d_model=d_model,
        num_heads=num_heads,
        dropout=dropout,
    )(inputs=[inputs, enc_padding_mask])

    dec_outputs = decoder(
        vocab_size=vocab_size,
        num_layers=num_layers,
        units=units,
        d_model=d_model,
        num_heads=num_heads,
        dropout=dropout,
    )(inputs=[dec_inputs, enc_outputs, look_ahead_mask, dec_padding_mask])

    outputs = tf.keras.layers.Dense(units=vocab_size, name="outputs")(dec_outputs)

    return tf.keras.Model(inputs=[inputs, dec_inputs], outputs=outputs, name=name)

I don't understand why it's giving this error since the model trains perfectly fine. Any help would be appreciated.

My saving code for reference:

print("Saving the model.")
saveloc = "C:/tmp/solar.h5"
model.save(saveloc)
print("Model saved to: " + saveloc + " successfully.")

Upvotes: 55

Views: 44576

Answers (6)

fathi mhiri

Reputation: 21

There are two model save formats: TensorFlow SavedModel and HDF5.

1. SavedModel saves the execution graph, so it can save custom objects like subclassed models and custom layers without requiring the original code. If you supply only a name, e.g. model.save('my_model'), the SavedModel format is used by default.

2. To save custom objects in the HDF5 format, your object must have a get_config method, and optionally a from_config classmethod (see the documentation). If you supply the .h5 file extension, e.g. model.save('my_model.h5'), the HDF5 format is used.

I used the first method (SavedModel format) and it worked for me without errors.

Upvotes: 0

Anar Garib

Reputation: 1

Adding the save_format='tf' parameter solved it for me:

model.save(MODEL_PATH, save_format='tf')

Upvotes: 0

Shraddha Mishra

Reputation: 31

I suggest you try the following:

model = tf.keras.Model(...)
model.save_weights("some_path")
...
model.load_weights("some_path")

Upvotes: 3

John Nash

Reputation: 71

This problem is caused by mixing imports between the keras and tf.keras libraries, which is not supported.

Use tf.keras everywhere, or use standalone keras everywhere.

You should never mix imports between these libraries: it will not work and produces all kinds of strange error messages, which change with the versions of keras and tensorflow.

Upvotes: 7

Hunaidkhan

Reputation: 1418

I think a simple solution is to install tensorflow==2.4.2 (or tensorflow-gpu==2.4.2 for GPU). I faced this issue and debugged the whole day without resolving it; finally I installed the older stable version and the error was gone.

Upvotes: 1

EliadL
EliadL

Reputation: 7068

It's not a bug, it's a feature.

This error lets you know that TF can't save your model, because it won't be able to load it.
Specifically, it won't be able to reinstantiate your custom Layer classes: encoder and decoder.

To solve this, just override their get_config method according to the new arguments you've added.

A layer config is a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.


For example, if your encoder class looks something like this:

class encoder(tf.keras.layers.Layer):

    def __init__(
        self,
        vocab_size, num_layers, units, d_model, num_heads, dropout,
        **kwargs,
    ):
        super().__init__(**kwargs)
        self.vocab_size = vocab_size
        self.num_layers = num_layers
        self.units = units
        self.d_model = d_model
        self.num_heads = num_heads
        self.dropout = dropout

    # Other methods etc.

then you only need to override this method:

    def get_config(self):

        config = super().get_config().copy()
        config.update({
            'vocab_size': self.vocab_size,
            'num_layers': self.num_layers,
            'units': self.units,
            'd_model': self.d_model,
            'num_heads': self.num_heads,
            'dropout': self.dropout,
        })
        return config

When TF sees this (for both classes), you will be able to save the model.

Because now when the model is loaded, TF will be able to reinstantiate the same layer from config.


Layer.from_config's source code may give a better sense of how it works:

@classmethod
def from_config(cls, config):
  return cls(**config)
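The round trip can be sketched without TensorFlow. Below is a minimal pure-Python stand-in (the Layer and Encoder classes and the vocab_size argument here are illustrative, not TF's actual implementation) showing why every extra __init__ argument must appear in the config:

```python
class Layer:
    """Minimal stand-in for tf.keras.layers.Layer (illustrative only)."""
    def __init__(self, name="layer"):
        self.name = name

    def get_config(self):
        # The base config only knows about base-class arguments.
        return {"name": self.name}

    @classmethod
    def from_config(cls, config):
        # Reinstantiation passes the config straight to __init__ ...
        return cls(**config)


class Encoder(Layer):
    def __init__(self, vocab_size, **kwargs):
        super().__init__(**kwargs)
        self.vocab_size = vocab_size

    def get_config(self):
        # ... so every extra __init__ argument must be added to the
        # config, otherwise cls(**config) fails with a TypeError.
        config = super().get_config().copy()
        config.update({"vocab_size": self.vocab_size})
        return config


enc = Encoder(vocab_size=8000, name="enc")
clone = Encoder.from_config(enc.get_config())
print(clone.vocab_size, clone.name)  # → 8000 enc
```

Without the get_config override on Encoder, from_config would receive only {"name": "enc"} and calling Encoder(**config) would fail for the missing vocab_size, which is exactly what the NotImplementedError is protecting you from at save time.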

Upvotes: 88
