Reputation: 1
I’m working with two similar Inception v3 models: one trained to predict age and a second trained to predict gender. Both have identical weights up to layer 250. I need to merge them to reduce size and processing time.
In other words, the output of layer 250 will go into layer 251 of the age branch and into layer 251 of the gender branch, and at the end I will get two different predictions.
I have achieved this only with the classification section; it doesn’t work with the hidden layers.
Upvotes: 0
Views: 635
Reputation: 5412
I need to merge them to reduce size.
There is no easy way to do that. Even though the network architectures of both models are identical, they were trained for different classification tasks, so their parameters are different. If you use layers from the age-prediction model in the gender-classification model, that is not going to work. Layers in deep neural networks are highly coupled (this is one of the limitations of deep learning). One thing you can do is take the layers you want to share between age and gender and train them on both the age and the gender datasets. After training on both datasets, you can freeze these shared layers and train the remaining layers for age and for gender classification (this can be done successively).
Keras has functionalities for both sharing and freezing layers.
from keras.layers import Input, Dense
from keras.models import Model

# one shared hidden layer feeding two task-specific output heads
inputs = Input(input_shape)
shared_layer = Dense(num_units1)(inputs)
output_one = Dense(num_units2)(shared_layer)  # e.g. age head
output_two = Dense(num_units3)(shared_layer)  # e.g. gender head
model = Model(inputs=inputs, outputs=[output_one, output_two])
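As a minimal runnable sketch of this shared-trunk / two-head pattern (the shapes, unit counts, and losses below are placeholders for illustration, not the actual Inception v3 configuration):

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

inputs = Input(shape=(32,))
shared = Dense(16, activation="relu")(inputs)        # trunk shared by both tasks
age_out = Dense(1, name="age")(shared)               # regression head
gender_out = Dense(2, activation="softmax", name="gender")(shared)

model = Model(inputs=inputs, outputs=[age_out, gender_out])
# one loss per output, keyed by the output layer names
model.compile(optimizer="adam",
              loss={"age": "mse",
                    "gender": "sparse_categorical_crossentropy"})

# a single forward pass yields both predictions
x = np.random.rand(4, 32).astype("float32")
age_pred, gender_pred = model.predict(x, verbose=0)
print(age_pred.shape, gender_pred.shape)  # (4, 1) (4, 2)
```

At training time you would pass both label arrays (one per head) to `model.fit`, so the shared trunk receives gradients from both tasks.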
from keras.layers import Input, concatenate
from keras.models import Model

# merging two inputs into a single layer
input1 = Input(input1_shape)
input2 = Input(input2_shape)
merged_layer = concatenate([input1, input2])
model = Model(inputs=[input1, input2], outputs=merged_layer)
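For concreteness, here is the same two-input merge with toy shapes (the shapes are assumptions for illustration only); `concatenate` joins the inputs along the feature axis:

```python
import numpy as np
from tensorflow.keras.layers import Input, concatenate
from tensorflow.keras.models import Model

input1 = Input(shape=(8,))
input2 = Input(shape=(4,))
merged = concatenate([input1, input2])  # feature dim becomes 8 + 4 = 12
model = Model(inputs=[input1, input2], outputs=merged)

# a two-input model is called with a list of arrays, one per input
out = model.predict([np.ones((2, 8)), np.ones((2, 4))], verbose=0)
print(out.shape)  # (2, 12)
```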
# freeze the first x layers, where x is the number of layers to freeze
for i in range(x):
    model.layers[i].trainable = False
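Note that changing `trainable` only takes effect after you compile the model again. A small self-contained sketch (the layer count and shapes are placeholders):

```python
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

inputs = Input(shape=(8,))
h = Dense(16)(inputs)
h = Dense(16)(h)
outputs = Dense(2)(h)
model = Model(inputs=inputs, outputs=outputs)

x = 2  # number of leading layers to freeze (placeholder value)
for i in range(x):
    model.layers[i].trainable = False

# recompile so the new trainable flags are picked up by training
model.compile(optimizer="adam", loss="mse")
print([layer.trainable for layer in model.layers])
```

After this, `model.fit` updates only the unfrozen layers, which is what lets you train the age and gender heads without disturbing the shared trunk.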
Upvotes: 1