Reputation: 3207
The usage syntax is clear:
decay = tf.constant(0.001, dtype=tf.float32)
w = tf.get_variable(name='weight', shape=[512, 512],
                    regularizer=tf.contrib.layers.l2_regularizer(decay))
However, in the documentation only the following is stated:
regularizer
: A (Tensor -> Tensor or None) function; the result of applying it on a newly created variable will be added to the collection tf.GraphKeys.REGULARIZATION_LOSSES and can be used for regularization.
The above does not say that the regularization loss is minimized automatically. So do we need to manually fetch the losses from the collection tf.GraphKeys.REGULARIZATION_LOSSES
and add them to our main loss in order for the regularization to take effect?
Upvotes: 3
Views: 411
Reputation: 19123
So do we need to manually get the variable from the collection tf.GraphKeys.REGULARIZATION_LOSSES and add it to our main loss in order for it to be applied?
Yes and no: you need to collect the regularization losses yourself, but you don't need to walk the collection manually. tf.losses.get_regularization_loss()
already sums everything registered in tf.GraphKeys.REGULARIZATION_LOSSES. You then add that term to your model's loss and hand the total to your optimizer:
logits = model_fn(inputs)
model_loss = your_chosen_loss_function(logits)
# Sums all losses registered in tf.GraphKeys.REGULARIZATION_LOSSES.
regularization_loss = tf.losses.get_regularization_loss()
train_op = your_chosen_optimizer.minimize(model_loss + regularization_loss)
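For intuition about what value ends up in the collection: tf.contrib.layers.l2_regularizer(scale) computes scale * tf.nn.l2_loss(w), and tf.nn.l2_loss is sum(w**2) / 2. A plain-Python sketch (no TensorFlow needed, purely illustrative) of that penalty for a tiny weight matrix:

```python
def l2_penalty(weights, scale):
    # Mirrors scale * tf.nn.l2_loss(w): scale * sum of squares, halved.
    return scale * sum(x * x for row in weights for x in row) / 2.0

w = [[1.0, -2.0],
     [3.0,  0.5]]
decay = 0.001
print(l2_penalty(w, decay))  # 0.001 * (1 + 4 + 9 + 0.25) / 2 = 0.007125
```

This is the number the regularizer function returns for the variable; get_regularization_loss() simply adds up one such term per regularized variable.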
Upvotes: 2