Pawel

Reputation: 169

TensorFlow: Softmax applied to each entry

I have a tensor x of type tf.float32 and I would like to apply softmax over all of its entries at once. Unfortunately, the built-in tf.nn.softmax applies softmax along a single specified axis only.

The solution I thought of:

e = tf.exp(x)
softmaxed = e / tf.reduce_sum(e)

does not work: if x contains large entries (e.g. 100), tf.exp overflows float32 to inf, and the division then produces nan.
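A minimal reproduction of the failure (made-up values, assuming TensorFlow 2.x eager execution):

import tensorflow as tf

# exp(100) ~ 2.7e43, which exceeds the float32 max (~3.4e38)
x = tf.constant([[100.0, 1.0], [2.0, 3.0]], dtype=tf.float32)
e = tf.exp(x)                     # the 100 entry overflows to inf
softmaxed = e / tf.reduce_sum(e)  # the sum is inf, so inf/inf gives nan
print(softmaxed)                  # [[nan 0.] [0. 0.]]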

Upvotes: 0

Views: 126

Answers (1)

dtward

Reputation: 869

Because softmax(x) = softmax(x - c) for any constant c (a factor of exp(-c) comes out of every exponential and cancels between numerator and denominator), you can apply the softmax in a numerically stable way by subtracting an appropriate constant. Subtracting the maximum over all your entries makes every exponential lie between 0 and 1, so the result can be computed stably. Give this a try:

max_x = tf.reduce_max(x)          # global max over all entries
e = tf.exp(x - max_x)             # exponents now lie in (0, 1], no overflow
softmaxed = e / tf.reduce_sum(e)  # normalize over every entry
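For a quick sanity check, you can also compare against the built-in applied to the flattened tensor, which covers every entry with one axis (a sketch assuming TensorFlow 2.x eager execution; the example values are made up):

import tensorflow as tf

x = tf.constant([[100.0, 1.0], [2.0, 3.0]], dtype=tf.float32)

# Stable softmax over all entries via max subtraction
max_x = tf.reduce_max(x)
e = tf.exp(x - max_x)
softmaxed = e / tf.reduce_sum(e)

# Cross-check: flatten so the axis-wise built-in sees every entry,
# then restore the original shape
check = tf.reshape(tf.nn.softmax(tf.reshape(x, [-1])), tf.shape(x))
print(softmaxed)  # [[1. 0.] [0. 0.]] up to float32 precision
print(check)      # same values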

Upvotes: 1
