Uwe.Schneider

Reputation: 1415

Implementing an L2 loss in a TensorFlow Sequential regression model

I created a Keras/TensorFlow model, heavily influenced by this guide, which looks like this:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
import time 
import numpy as np
import sys
from tensorflow.keras import losses


model = keras.Sequential()
model.add(layers.Dense(nodes,activation = tf.keras.activations.relu, input_shape=[len(data_initial.keys())]))
model.add(layers.Dense(64,activation = tf.keras.activations.relu))
model.add(layers.Dropout(0.1, noise_shape=None))
model.add(layers.Dense(1))

model.compile(loss='mse',    # <-------- Here we define the loss function
              optimizer=tf.keras.optimizers.Adam(learning_rate=0.01,
                                                 beta_1=0.01,
                                                 beta_2=0.001,
                                                 epsilon=0.03),
              metrics=['mae', 'mse'])
model.fit(train_data,train_labels,epochs = 200)

It is a regression model, and instead of loss='mse' I would like to use the tf.keras MSE loss together with an L2 regularization term. The question is: how do I add the L2 term to the loss?

Upvotes: 2

Views: 4847

Answers (1)

mujjiga

Reputation: 16896

You can add regularization either as a layer parameter or as a separate layer.

Using it as a layer parameter looks like this:

from tensorflow.keras import regularizers

model.add(layers.Dense(8,
          kernel_regularizer=regularizers.l2(0.01),     # L2 penalty on the layer's weights
          activity_regularizer=regularizers.l1(0.01)))  # L1 penalty on the layer's output
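Using it as a layer instead: a minimal sketch with layers.ActivityRegularization, which penalizes the output of the previous layer (its activations) rather than its weights.

import tensorflow as tf
from tensorflow.keras import layers

# As a standalone layer: ActivityRegularization adds an L1/L2 penalty on the
# activations flowing through it to the model's total training loss.
model = tf.keras.Sequential()
model.add(layers.Dense(8, activation='relu', input_shape=(8,)))
model.add(layers.ActivityRegularization(l1=0.0, l2=0.01))
model.add(layers.Dense(1))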

Here is sample code with the first dense layer regularized and a custom loss function:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
import time 
import numpy as np
import sys
from tensorflow.keras import losses
from tensorflow.keras import regularizers
from tensorflow.keras import backend as K


model = keras.Sequential()
model.add(layers.Dense(8,activation = tf.keras.activations.relu, input_shape=(8,), 
                       kernel_regularizer=regularizers.l2(0.01), 
                       activity_regularizer=regularizers.l1(0.01)))

model.add(layers.Dense(4,activation = tf.keras.activations.relu))
model.add(layers.Dropout(0.1, noise_shape=None))
model.add(layers.Dense(1))


def custom_loss(y_true, y_pred):
    # Mean squared error: average the squared differences
    return K.mean(K.square(y_true - y_pred))

model.compile(loss=custom_loss,
              optimizer=tf.keras.optimizers.Adam(learning_rate=0.01,
                                                 beta_1=0.01,
                                                 beta_2=0.001,
                                                 epsilon=0.03),
              metrics=['mae', 'mse'])

model.fit(np.random.randn(10,8),np.random.randn(10,1),epochs = 1)
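Note that Keras collects the penalties from kernel_regularizer / activity_regularizer and adds them to whatever loss you compile with, so the custom loss above does not need to include the L2 term itself. You can inspect the collected penalties through model.losses; a quick check on the model above:

# Each regularized layer contributes a penalty tensor here; these are
# summed into the total training loss on top of custom_loss.
print(model.losses)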

Upvotes: 2
