Brian16446

Reputation: 149

Keras: add a normalisation layer so the sum of the values is 1

I want to add a layer to my network that takes the output of the previous layer and produces a probability distribution, where all of the values are non-negative and sum to 1. That is, any negative values are set to 0, and the remaining positive values are normalised so that the outputs sum to 1.

How can I do this?

Upvotes: 0

Views: 442

Answers (1)

AloneTogether

Reputation: 26698

IIUC, you can just use the relu and softmax activation functions for that:

import tensorflow as tf

inputs = tf.keras.layers.Input((5,))
x = tf.keras.layers.Dense(32, activation='relu')(inputs)
outputs = tf.keras.layers.Dense(32, activation='softmax')(x)
model = tf.keras.Model(inputs, outputs)

x = tf.random.normal((1, 5))
print(model(x))
print(tf.reduce_sum(model(x)))

which outputs:

tf.Tensor(
[[0.02258478 0.0218816  0.03778725 0.02707791 0.02791201 0.01847759
  0.03252319 0.02181962 0.02726094 0.02221758 0.02674739 0.03611234
  0.02821671 0.02606457 0.04022215 0.02933712 0.02975486 0.036876
  0.04303711 0.03443421 0.03356075 0.03135845 0.03266712 0.03934086
  0.02475732 0.04486758 0.02205345 0.0416355  0.04394628 0.03109134
  0.03432642 0.03004995]], shape=(1, 32), dtype=float32)
tf.Tensor(1.0, shape=(), dtype=float32)
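
Note that tf.reduce_sum without an axis argument sums over the whole batch, so the check above only reads 1.0 because the batch contains a single sample. For a larger batch you could sum along the last axis instead to verify each row separately (a small addition of mine):

print(tf.reduce_sum(model(tf.random.normal((4, 5))), axis=-1))  # each of the 4 row sums is ~1.0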

So, if x is the output of your previous layer, you can just run:

x = tf.nn.relu(x)
x = tf.nn.softmax(x)
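
Keep in mind that softmax exponentiates before normalising, so it is not quite the literal behaviour you describe (with softmax, entries zeroed by relu still receive exp(0) = 1 weight). If you want exactly "zero the negatives, then divide by the sum", here is a minimal sketch using a Lambda layer; the relu_sum_normalize helper and the epsilon guard against an all-zero row are my own additions:

import tensorflow as tf

def relu_sum_normalize(x, eps=1e-9):
    x = tf.nn.relu(x)  # set negative values to 0
    # divide each row by its sum; eps avoids division by zero if a row is all zeros
    return x / (tf.reduce_sum(x, axis=-1, keepdims=True) + eps)

inputs = tf.keras.layers.Input((5,))
x = tf.keras.layers.Dense(32)(inputs)
outputs = tf.keras.layers.Lambda(relu_sum_normalize)(x)
model = tf.keras.Model(inputs, outputs)

print(tf.reduce_sum(model(tf.random.normal((1, 5))), axis=-1))  # ~1.0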

Upvotes: 1
