jd95

Reputation: 464

Ways to limit the output of a NN regression to a certain range (i.e. I want my NN to always predict output values only between -20 and +30)

I am training a NN for a regression problem, so the output layer has a linear activation function. The NN output is supposed to be between -20 and 30. My NN performs well most of the time, but sometimes it gives output greater than 30, which is not desirable for my system. Does anyone know an activation function that can provide this kind of restriction on the output, or any suggestions for modifying the linear activation function for my application?

I am using Keras with the TensorFlow backend for this application.

Upvotes: 1

Views: 2449

Answers (2)

Thibault Bacqueyrisses

Reputation: 2331

What you can do is activate your last layer with a sigmoid, so the result will be between 0 and 1, and then create a custom layer to rescale it to the desired range:

def get_range(input, maxx, minn):
    # input is a sigmoid output in (0, 1); map it linearly onto (minn, maxx)
    return input * (maxx - minn) + minn

and then add this to your network :

out = layers.Lambda(get_range, arguments={'maxx': 30, 'minn': -20})(sigmoid_output)

The output will be rescaled to lie between 'minn' and 'maxx'.

UPDATE

If you want to clip your outputs without rescaling them, do this instead:

def clip(input, maxx, minn):
    return K.clip(input, minn, maxx)

out = layers.Lambda(clip, arguments={'maxx': 30, 'minn': -20})(linear_output)

Note that the clip is applied to the linear output here; clipping a sigmoid output, which is already in [0, 1], would have no effect for this range.
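The effect of `K.clip` can be shown with its NumPy counterpart (an illustration only; `np.clip` behaves the same way on the values):

```python
import numpy as np

def clip(x, maxx, minn):
    # NumPy counterpart of K.clip(input, minn, maxx)
    return np.clip(x, minn, maxx)

preds = np.array([-35.0, -5.0, 12.0, 45.0])  # raw linear-layer outputs
clipped = clip(preds, maxx=30, minn=-20)
# values below -20 are set to -20, values above 30 are set to 30;
# in-range values pass through unchanged
```

Unlike the rescaling approach, clipping leaves in-range predictions untouched but gives zero gradient for out-of-range ones, which is worth keeping in mind during training.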

Upvotes: 3

Dr. Snoopy

Reputation: 56357

What you should do is normalize your target outputs to the range [-1, 1] or [0, 1], then use a tanh (for [-1, 1]) or sigmoid (for [0, 1]) activation at the output, and train the model with the normalized data.

Then, during inference, you can denormalize the predictions to get values back in your original range.
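The normalize/denormalize round trip can be sketched with NumPy (a minimal sketch; the range [-20, 30] comes from the question, and the helper names are my own, not from any library):

```python
import numpy as np

LO, HI = -20.0, 30.0  # known output range from the question

def normalize(y):
    # map targets from [LO, HI] to [0, 1] for training against a sigmoid output
    return (y - LO) / (HI - LO)

def denormalize(y_norm):
    # map the model's [0, 1] predictions back onto [LO, HI]
    return y_norm * (HI - LO) + LO

targets = np.array([-20.0, 0.0, 30.0])
recovered = denormalize(normalize(targets))
# the round trip recovers the original targets
```

Because the sigmoid (or tanh) saturates at the range boundaries, the model's denormalized predictions stay inside [LO, HI] by construction.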

Upvotes: 3
