Reputation: 727
I'm trying to participate in my first Kaggle competition, where RMSLE is given as the required loss function. Since I have found nothing on how to implement this loss function, I tried to settle for RMSE. I know this was part of Keras in the past; is there any way to use it in the latest version, maybe with a custom function via the backend?
This is the NN I designed:
from keras.models import Sequential
from keras.layers.core import Dense, Dropout
from keras import regularizers
model = Sequential()
model.add(Dense(units = 128, kernel_initializer = "uniform", activation = "relu", input_dim = 28, activity_regularizer = regularizers.l2(0.01)))
model.add(Dropout(rate = 0.2))
model.add(Dense(units = 128, kernel_initializer = "uniform", activation = "relu"))
model.add(Dropout(rate = 0.2))
model.add(Dense(units = 1, kernel_initializer = "uniform", activation = "relu"))
model.compile(optimizer = "rmsprop", loss = "root_mean_squared_error")#, metrics =["accuracy"])
model.fit(train_set, label_log, batch_size = 32, epochs = 50, validation_split = 0.15)
I tried a custom root_mean_squared_error function I found on GitHub, but as far as I can tell the syntax is not what is required. I think y_true and y_pred would have to be defined before being passed to the return, but I have no idea how exactly; I just started programming in Python and I am really not that good at math...
from keras import backend as K
def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true), axis=-1))
I receive the following error with this function:
ValueError: ('Unknown loss function', ':root_mean_squared_error')
Thanks for your ideas, I appreciate every help!
Upvotes: 47
Views: 87083
Reputation: 61
Same idea as the other answers, but a more direct version of RMSLE using the Keras backend:
import tensorflow as tf
import tensorflow.keras.backend as K
def root_mean_squared_log_error(y_true, y_pred):
    msle = tf.keras.losses.MeanSquaredLogarithmicError()
    return K.sqrt(msle(y_true, y_pred))
Upvotes: 4
Reputation: 56
You can compute RMSLE the same way RMSE is shown in the other answers; you just also need to incorporate the log function:
from tensorflow.keras import backend as K
def root_mean_squared_log_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(K.log(1 + y_pred) - K.log(1 + y_true))))
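As a sanity check (my own addition, not part of the answer), the same formula can be evaluated in plain numpy; np.log1p is just the numerically stable form of log(1 + x):

```python
import numpy as np

# RMSLE straight from its definition:
# sqrt(mean((log(1 + y_pred) - log(1 + y_true))^2))
def rmsle_numpy(y_true, y_pred):
    return np.sqrt(np.mean(np.square(np.log1p(y_pred) - np.log1p(y_true))))

y_true = np.array([1.0, 3.0, 9.0])
y_pred = np.array([2.0, 3.0, 8.0])
print(rmsle_numpy(y_true, y_pred))
```

The values here are arbitrary example data; the point is that the backend expression above computes exactly this quantity on tensors.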
Upvotes: 3
Reputation: 56397
When you use a custom loss, you need to pass it without quotes, i.e. you pass the function object, not a string:
from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true)))

model.compile(optimizer = "rmsprop", loss = root_mean_squared_error,
              metrics = ["accuracy"])
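For reference, a plain-numpy sketch (my own example, not from the answer) of the quantity this custom loss computes:

```python
import numpy as np

# sqrt(mean((y_pred - y_true)^2)), the same value the Keras
# backend expression evaluates on tensors
def rmse_numpy(y_true, y_pred):
    return np.sqrt(np.mean(np.square(y_pred - y_true)))

print(rmse_numpy(np.array([1.0, 2.0]), np.array([3.0, 2.0])))  # sqrt(2)
```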
Upvotes: 76
Reputation: 1233
I prefer reusing part of the work Keras already does:
from keras import backend as K
from keras.losses import mean_squared_error

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(mean_squared_error(y_true, y_pred))

model.compile(optimizer = "rmsprop", loss = root_mean_squared_error,
              metrics = ["accuracy"])
Upvotes: 7
Reputation: 401
If you are using the latest TensorFlow nightly: although there is no RMSE in the documentation, there is a tf.keras.metrics.RootMeanSquaredError() in the source code.
Sample usage:
model.compile(tf.compat.v1.train.GradientDescentOptimizer(learning_rate),
              loss=tf.keras.metrics.mean_squared_error,
              metrics=[tf.keras.metrics.RootMeanSquaredError(name='rmse')])
Upvotes: 16
Reputation: 689
The accepted answer originally contained an error (the axis=-1 inside the sqrt, as in the question's code) that makes the "RMSE" actually compute the MAE, per the following issue:
https://github.com/keras-team/keras/issues/10706
The correct definition should be
def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true)))
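A small numpy sketch (my own illustration, not Keras code) of why the axis=-1 version degenerates to the MAE when the model has a single output: each sample's mean over the last axis covers just one value, so the square and the square root cancel into an absolute value before the batch average is taken.

```python
import numpy as np

y_true = np.array([[1.0], [2.0], [6.0]])   # shape (batch, 1), single output
y_pred = np.array([[2.0], [2.0], [2.0]])
d = y_pred - y_true

# sqrt(mean(d^2, axis=-1)) over a length-1 axis is just |d| per sample,
# so Keras's batch average of this per-sample loss yields the MAE
mae_like = np.mean(np.sqrt(np.mean(np.square(d), axis=-1)))

# the corrected loss takes one mean over everything before the sqrt
rmse = np.sqrt(np.mean(np.square(d)))

print(mae_like)  # 5/3, identical to the MAE
print(rmse)      # sqrt(17/3), the actual RMSE
```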
Upvotes: 37