Tim Hsu

Reputation: 412

Is it possible to skip the last few layers during the back propagation phase?

I am designing a machine learning model using a neural network.

During the back propagation phase, I don't want the weights of the nodes in the last few layers to change. Is this mathematically possible?

Upvotes: 0

Views: 272

Answers (2)

Thomas Pinetz

Reputation: 7148

In Keras you can accomplish this by setting the trainable property of those layers to False (https://keras.io/applications/#Fine-tune InceptionV3 on a new set of classes). However, I will have to side with the other answer and ask why you would ever want to do this.
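
For example, something like the following (a minimal sketch; the architecture and layer sizes here are made up) freezes the last two layers so their weights are skipped during training, while gradients still flow through them to the earlier layers:

    from keras.models import Sequential
    from keras.layers import Dense

    # Toy model; sizes and layer count are placeholders.
    model = Sequential([
        Dense(64, activation="relu", input_shape=(20,)),
        Dense(64, activation="relu"),
        Dense(32, activation="relu"),
        Dense(10, activation="softmax"),
    ])

    # Freeze the last two layers: their weights are excluded from updates,
    # but the error is still back-propagated through them.
    for layer in model.layers[-2:]:
        layer.trainable = False

    # Compile AFTER changing `trainable`, otherwise the change is ignored.
    model.compile(optimizer="adam", loss="categorical_crossentropy")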

Upvotes: 2

Yuriy Zaletskyy

Reputation: 5151

I will omit the mathematical explanation and just describe how the neural network will behave.

So, let's say you have a neural network with 5 layers, and the weights of the last (fifth) layer don't change. What will happen in that case?

The error from the last layer will still be propagated back to the previous (fourth) layer, but the last layer itself stays fixed, so the first four layers have to compensate for the unchanging transformation applied by the output layer. The network can still learn to deal with this extra transformation, but the question remains: why on earth would you need to keep that transformation fixed?
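
To make this concrete, here is a minimal sketch (the five-layer toy model and data are made up) showing that after one training step the frozen fifth layer's weights are unchanged while the earlier layers still update, because gradients keep flowing back through the frozen layer:

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense

    # Five-layer toy network; the last (fifth) layer will be frozen.
    model = Sequential([
        Dense(16, activation="relu", input_shape=(8,)),
        Dense(16, activation="relu"),
        Dense(16, activation="relu"),
        Dense(16, activation="relu"),
        Dense(1, activation="sigmoid"),
    ])
    model.layers[-1].trainable = False
    model.compile(optimizer="sgd", loss="binary_crossentropy")

    # Random dummy data just to run one training step.
    x = np.random.rand(32, 8)
    y = np.random.randint(0, 2, size=(32, 1))

    before = [w.copy() for w in model.get_weights()]
    model.fit(x, y, epochs=1, verbose=0)
    after = model.get_weights()

    # The last two entries (frozen layer's kernel and bias) print True
    # (unchanged); the earlier layers' weights print False (updated).
    for b, a in zip(before, after):
        print(np.allclose(b, a))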

Upvotes: 0
