Reputation: 75
Good day, everyone.
I want to have two separate TensorFlow models (f and g) and train both of them on the loss of f(g(x)). However, I also want to use them separately, like g(x), or f(e) where e is an embedded vector that did not come from g.
For example, the classical way to create the model with embedding looks like this:
# First block
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Embedding, GlobalAveragePooling1D

# vectorize_layer, vocab_size and embedding_dim are defined earlier
model = Sequential([
    vectorize_layer,
    Embedding(vocab_size, embedding_dim, name="embedding"),
    GlobalAveragePooling1D(),
    Dense(16, activation='relu'),
    Dense(1)
])
I want to have the option to pass data through the embedding layer and through all the other layers separately, but still train the model as a whole unit, like:
# Second block
g = Sequential([
    vectorize_layer,
    Embedding(vocab_size, embedding_dim, name="embedding")
])
f = Sequential([
    GlobalAveragePooling1D(),
    Dense(16, activation='relu'),
    Dense(1)
])
Can I do that without dropping down to low-level TensorFlow? The first block of code is exactly what I want to train, and I can leave it as is, but I need some way to pass data through specific layers on their own.
Upvotes: 4
Views: 396
Reputation: 1687
This can be achieved with weight sharing, i.e. shared layers. To share layers between different models in Keras, you just need to pass the same layer instance to both models.
Example code:
# Initialize the shared layers beforehand
em_layer = Embedding(vocab_size, embedding_dim, name="embedding")
firstfc = Dense(16, activation='relu')
secondfc = Dense(1)

main_model = Sequential([
    vectorize_layer,
    em_layer,
    GlobalAveragePooling1D(),
    firstfc,
    secondfc
])
g = Sequential([
    vectorize_layer,
    em_layer
])
f = Sequential([
    GlobalAveragePooling1D(),
    firstfc,
    secondfc
])
Training with f.fit() or g.fit() will be reflected in main_model, and vice versa.
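For completeness, here is a minimal end-to-end sketch of that idea. It assumes vectorize_layer is a TextVectorization layer (available in tf.keras.layers from TF 2.6 onwards); the toy dataset and the vocab_size / embedding_dim / sequence_length values are placeholders invented purely for illustration:
import numpy as np
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import (Dense, Embedding,
                                     GlobalAveragePooling1D, TextVectorization)

vocab_size, embedding_dim, sequence_length = 1000, 8, 10

# Toy data, purely for illustration
texts = np.array(["good movie", "bad movie", "great film", "awful film"])
labels = np.array([1, 0, 1, 0])

vectorize_layer = TextVectorization(max_tokens=vocab_size,
                                    output_sequence_length=sequence_length)
vectorize_layer.adapt(texts)

# Shared layer instances
em_layer = Embedding(vocab_size, embedding_dim, name="embedding")
firstfc = Dense(16, activation='relu')
secondfc = Dense(1)

main_model = Sequential([vectorize_layer, em_layer,
                         GlobalAveragePooling1D(), firstfc, secondfc])
g = Sequential([vectorize_layer, em_layer])
f = Sequential([GlobalAveragePooling1D(), firstfc, secondfc])

# Train the whole pipeline once; Dense(1) outputs logits
main_model.compile(optimizer='adam',
                   loss=tf.keras.losses.BinaryCrossentropy(from_logits=True))
main_model.fit(texts, labels, epochs=2, verbose=0)

# Use the trained parts separately
e = g.predict(texts)    # embeddings only, shape (4, 10, 8)
out = f.predict(e)      # should match main_model.predict(texts)
Because f and g hold the very same Embedding and Dense objects as main_model, any weight update made through one model is immediately visible through the others. The pooling layer is duplicated rather than shared, which is harmless since it has no trainable weights.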
Upvotes: 2