Reputation: 1158
I want to add new nodes to the output layer so I can train them later. I'm doing:
def add_outputs(self, n_new_outputs):
    out = self.model.get_layer('fc8').output
    last_layer = self.model.get_layer('fc7').output
    out2 = Dense(n_new_outputs, activation='softmax', name='fc9')(last_layer)
    output = merge([out, out2], mode='concat')
    self.model = Model(input=self.model.input, output=output)
where 'fc7' is the fully connected layer before the output layer 'fc8'. I expected out = self.model.get_layer('fc8').output to give me just the last layer, but instead the output is the whole model.
Is there any way to take just a layer from a network?
Maybe there's an easier way to do it.
Thanks!
Upvotes: 5
Views: 1608
Reputation: 1158
Finally I found a solution:
1) get the weights from the last layer
2) pad the weights: zeros for the new biases and small random values for the new connections
3) pop the output layer and create a new one
4) set the padded weights on the new layer
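The weight padding in steps 1 and 2 can be sketched with plain NumPy (the helper name and the small-random initialization scale are illustrative, not part of any API):

```python
import numpy as np

def expand_output_weights(kernel, bias, n_new_outputs, rng=None):
    """Pad a Dense layer's weights to cover extra output units.

    The existing columns are kept untouched; the new kernel columns get
    small random values and the new bias entries are zero, so the old
    output units keep their learned behaviour.
    """
    rng = np.random.default_rng(rng)
    n_in = kernel.shape[0]
    new_cols = 0.0001 * rng.random((n_in, n_new_outputs))  # small random init
    kernel = np.concatenate((kernel, new_cols), axis=1)
    bias = np.concatenate((bias, np.zeros(n_new_outputs)), axis=0)
    return kernel, bias
```

The padded pair can then be passed to set_weights on the rebuilt output layer, as in the code below.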
Here is the code:
def add_outputs(self, n_new_outputs):
    # Increment the number of outputs
    self.n_outputs += n_new_outputs
    weights = self.model.get_layer('fc8').get_weights()
    # Pad the weights: the new biases are zero and the new
    # kernel columns get small random values
    shape = weights[0].shape[0]
    weights[1] = np.concatenate((weights[1], np.zeros(n_new_outputs)), axis=0)
    weights[0] = np.concatenate((weights[0], -0.0001 * np.random.random_sample((shape, n_new_outputs)) + 0.0001), axis=1)
    # Delete the old output layer
    self.model.layers.pop()
    last_layer = self.model.get_layer('batchnormalization_1').output
    # New output layer
    out = Dense(self.n_outputs, activation='softmax', name='fc8')(last_layer)
    self.model = Model(input=self.model.input, output=out)
    # Set the padded weights on the new layer
    self.model.get_layer('fc8').set_weights(weights)
    print(weights[0])
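As a sanity check on this scheme, a small self-contained NumPy sketch (all shapes are illustrative) shows that padding the kernel with extra columns and the bias with zeros leaves the scores of the original classes unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((2, 4))            # batch of 2 inputs, 4 features
kernel = rng.random((4, 3))       # old output layer: 4 -> 3 classes
bias = np.zeros(3)

old_logits = x @ kernel + bias

# Pad for 2 new classes: zero biases, small random kernel columns.
kernel2 = np.concatenate((kernel, 0.0001 * rng.random((4, 2))), axis=1)
bias2 = np.concatenate((bias, np.zeros(2)), axis=0)

new_logits = x @ kernel2 + bias2

# The first three columns are exactly the old logits.
assert np.allclose(new_logits[:, :3], old_logits)
```

Note that the softmax then renormalizes over the larger class set, so the old probabilities shift slightly even though their logits are identical.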
Upvotes: 3