Ali Sharifi B.

Reputation: 595

Is it necessary to define the backward function for frozen layers?

I have a function-fitting network with 4 hidden layers.

I need to learn suitable weights for the first and third layers, while the second and fourth layers are normalization-like layers that do not need to be learned, so I froze them by setting their learning rate to zero.
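(For context, a freeze like this is usually done per parameter blob in the prototxt. The question does not say which normalization layers are used; as one concrete possibility, a minimal sketch with Caffe's BatchNorm layer, whose three internal blobs are conventionally frozen this way, could look like the following. The layer and blob names are hypothetical:

```
layer {
  name: "norm1"        # hypothetical layer name
  type: "BatchNorm"
  bottom: "ip1"        # hypothetical bottom blob
  top: "norm1"
  # One param block per internal blob (mean, variance,
  # moving-average factor); lr_mult: 0 tells the solver
  # never to update it, i.e. the layer is frozen.
  param { lr_mult: 0 }
  param { lr_mult: 0 }
  param { lr_mult: 0 }
}
```
)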

My question is:

Should I define a backward function for those two frozen layers?

I noticed that in Caffe, the pooling layer, which has no learnable parameters, still implements a backward function.

Thanks in advance,

Upvotes: 1

Views: 161

Answers (1)

lejlot

Reputation: 66795

Yes, you need a backward pass; otherwise learning would stop at this layer, and nothing below it would learn. Even for non-learnable layers you need to compute valid gradients, because the chain rule has to pass through them for the error signal to reach the earlier layers.
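As an illustration, here is a minimal sketch of a Caffe Python layer with no learnable parameters that still implements backward(). The fixed statistics are hypothetical stand-ins for whatever your normalization layers actually compute; the point is that backward() applies the chain rule, dL/dx = dL/dy * dy/dx, so gradients keep flowing downward:

```python
import caffe


class FixedNormLayer(caffe.Layer):
    """Normalization layer with no learnable parameters.

    Nothing is ever updated here, yet backward() must still
    propagate gradients so earlier layers can learn.
    """

    def setup(self, bottom, top):
        # Hypothetical fixed statistics; in practice these might
        # come from param_str or be precomputed offline.
        self.mean = 0.0
        self.std = 1.0

    def reshape(self, bottom, top):
        top[0].reshape(*bottom[0].data.shape)

    def forward(self, bottom, top):
        # y = (x - mean) / std
        top[0].data[...] = (bottom[0].data - self.mean) / self.std

    def backward(self, top, propagate_down, bottom):
        # No weights to update, but dL/dx is still required:
        # dL/dx = dL/dy * dy/dx = top.diff / std
        if propagate_down[0]:
            bottom[0].diff[...] = top[0].diff / self.std
```

Freezing a layer (lr_mult: 0, or no parameters at all) only skips the weight update; the gradient with respect to the layer's input must still be computed. That is exactly why Caffe's pooling layer ships with a backward implementation even though it has nothing to learn.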

Upvotes: 2
