Ronan Venkat

Reputation: 355

How to regularize loss function?

I'm learning TensorFlow and I'm having some trouble understanding how to regularize the cost function. I've looked around and found a lot of different answers. Could someone please tell me how to regularize the cost function?

I took Andrew Ng's machine learning course on Coursera, and one thing seems different when I look at forums: most people seem to regularize each weight individually in addition to regularizing the final cost function, while the course makes no mention of that. Which approach is correct?

Upvotes: 0

Views: 3833

Answers (2)

Vlad

Reputation: 8605

In TensorFlow, L2 (Tikhonov) regularization with regularization parameter lambda_ could be written like this:

# Assuming you have defined a graph, placeholders (y) and a logits layer.
# Using cross-entropy loss:
lambda_ = 0.1
xentropy = tf.nn.softmax_cross_entropy_with_logits_v2(labels=y, logits=logits)
data_loss = tf.reduce_mean(xentropy)
# tf.nn.l2_loss(v) computes sum(v**2) / 2 for each trainable variable.
l2_norms = [tf.nn.l2_loss(v) for v in tf.trainable_variables()]
l2_norm = tf.reduce_sum(l2_norms)
cost = data_loss + lambda_*l2_norm
# From here, define the optimizer, the train operation, and train ... :-)
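For intuition, the cost above is just the mean cross-entropy plus lambda_ times the sum of per-variable L2 terms, where tf.nn.l2_loss(v) equals sum(v**2) / 2. Here is a framework-free NumPy sketch of the same arithmetic (the weight matrices and the loss value are made up purely for illustration):

```python
import numpy as np

lambda_ = 0.1
# Illustrative stand-ins for tf.trainable_variables().
weights = [np.array([1.0, -2.0]), np.array([[0.5, 0.5]])]
# Pretend tf.reduce_mean(xentropy) evaluated to this.
mean_xentropy = 0.7

# tf.nn.l2_loss(v) computes sum(v**2) / 2 for a single tensor.
l2_norms = [np.sum(v ** 2) / 2.0 for v in weights]
l2_norm = sum(l2_norms)

cost = mean_xentropy + lambda_ * l2_norm
print(cost)  # 0.7 + 0.1 * (5/2 + 0.5/2) = 0.975
```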

Upvotes: 2

Sharky

Reputation: 4543

Basically, you just define a regularizer function inside the desired layer.

tf.keras.layers.Conv2D(filters,
                       kernel_size,
                       strides=strides,
                       padding=padding,
                       ...,
                       # pass the penalty weight, e.g. l2(0.01)
                       kernel_regularizer=tf.keras.regularizers.l2()
                       )

With the Estimator API or low-level TensorFlow, you sum all regularization losses into your loss value. You can get them with tf.losses.get_regularization_loss() and either add that term to your loss yourself, or use tf.losses.get_total_loss(), which does it for you. Keras handles this internally.
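As a sanity check on what the Keras regularizer contributes: tf.keras.regularizers.l2(l) adds l * sum(w**2) per regularized kernel (note there is no 1/2 factor here, unlike tf.nn.l2_loss), and those terms are what get summed into the total loss. A small NumPy sketch of that arithmetic (the kernel and base loss values are illustrative):

```python
import numpy as np

l = 0.01                # the argument passed to tf.keras.regularizers.l2(l)
kernel = np.array([[0.3, -0.1],
                   [0.2,  0.4]])
data_loss = 1.25        # illustrative base loss value

# Keras L2 penalty for one kernel: l * sum(w**2)  (no 1/2 factor).
reg_loss = l * np.sum(kernel ** 2)

total_loss = data_loss + reg_loss
print(total_loss)  # 1.25 + 0.01 * 0.30 = 1.253
```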

Upvotes: 0
