nairouz mrabah

Reputation: 1217

Shared Layers, Different Models

I have two Keras models (functional API) that share some layers. I'm wondering: if I train the first model, will the second one get its shared layers' weights updated automatically, or do I have to load the weights manually?

I know from the documentation that layers can be shared within the same model, but I don't have any clue for this particular situation.

I am also wondering whether Keras models with shared layers share the same computational graph or have independent ones.

Upvotes: 3

Views: 2023

Answers (1)

rvinas

Reputation: 11895

When you train the first model, the weights from the shared layers will be updated automatically in every other model. Consider the following example:

from keras.layers import Input, Dense
from keras.models import Model

input_dim = 784   # example dimensions; use whatever fits your data
output_dim = 32

x = Input(shape=(input_dim,))
encoder = Dense(output_dim)(x)
decoder = Dense(input_dim)(encoder)

autoencoder = Model(inputs=x, outputs=decoder)
supervised = Model(inputs=x, outputs=encoder)

autoencoder.compile(...)
supervised.compile(...)

Here, when you train supervised, the weights of the shared Dense layer (the one producing encoder) are updated in both supervised and autoencoder. In other words, those weights belong to the layer itself, not to any particular model that uses the layer.
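If you want to convince yourself, here is a minimal, self-contained sanity check. It assumes standalone Keras (with tf.keras, swap the imports for tensorflow.keras.*); the layer name 'encoder', the small dimensions, and the random dummy data are just illustrative choices, not anything from the question:

import numpy as np
from keras.layers import Input, Dense
from keras.models import Model

input_dim, output_dim = 8, 3

x = Input(shape=(input_dim,))
encoded = Dense(output_dim, name='encoder')(x)   # the shared Dense layer
decoded = Dense(input_dim)(encoded)

autoencoder = Model(inputs=x, outputs=decoded)
supervised = Model(inputs=x, outputs=encoded)
supervised.compile(optimizer='sgd', loss='mse')

# Dummy data, only to make the weights move
data = np.random.rand(16, input_dim)
targets = np.random.rand(16, output_dim)

# Read the shared layer's kernel through the *autoencoder*, train the
# *supervised* model, then read it again through the autoencoder.
before = autoencoder.get_layer('encoder').get_weights()[0].copy()
supervised.fit(data, targets, epochs=1, verbose=0)
after = autoencoder.get_layer('encoder').get_weights()[0]

print(np.allclose(before, after))  # False: training supervised also changed
                                   # the weights seen through autoencoder

The check prints False because there is only one set of weights: both models hold a reference to the same Dense layer instance.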

For your second question, the answer is that Keras uses only one computational graph (even when the models don't share layers).
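As a quick sketch of that point (reusing the models from the snippet above, and assuming the TensorFlow 1.x backend, where Keras tensors expose a .graph attribute), you can check that both models are built from the same layer objects and live in the same graph:

# Both models reference the same Layer instance (hence the same weight
# variables) and, on the TF 1.x backend, the same default graph.
print(autoencoder.get_layer('encoder') is supervised.get_layer('encoder'))  # True
print(autoencoder.output.graph is supervised.output.graph)                  # True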

Upvotes: 6
