Wei Liu

Reputation: 1014

Per pixel softmax for fully convolutional network

I'm trying to implement something like a fully convolutional network, where the last convolution layer uses a 1x1 filter and outputs a 'score' tensor. The score tensor has shape [Batch, height, width, num_classes].

My question is: which function in TensorFlow can apply the softmax operation to each pixel, independently of the other pixels? The tf.nn.softmax op does not seem intended for this purpose.

If no such op is available, I guess I have to write one myself.

Thanks!

UPDATE: if I do have to implement it myself, I think I need to reshape the input tensor to [N, num_classes] where N = Batch x width x height, apply tf.nn.softmax, then reshape it back. Does that make sense?
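
To be concrete, this is roughly what I have in mind (just a sketch; the name pixelwise_softmax is mine, and it assumes num_classes is known statically):

import tensorflow as tf

def pixelwise_softmax(score):
    # score: [batch, height, width, num_classes]
    original_shape = tf.shape(score)                  # dynamic shape, to restore later
    num_classes = score.get_shape().as_list()[-1]     # assumes the class axis is static
    flat = tf.reshape(score, [-1, num_classes])       # [batch*height*width, num_classes]
    flat_softmax = tf.nn.softmax(flat)                # 2-D softmax, one row per pixel
    return tf.reshape(flat_softmax, original_shape)   # back to [batch, height, width, num_classes]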

Upvotes: 9

Views: 3964

Answers (2)

Apollo

Reputation: 153

You can use this function; I found it by searching on GitHub.

import tensorflow as tf

"""
Multi dimensional softmax,
refer to https://github.com/tensorflow/tensorflow/issues/210
compute softmax along the dimension of target
the native softmax only supports batch_size x dimension
"""
def softmax(target, axis, name=None):
    with tf.name_scope(name, 'softmax', values=[target]):
        max_axis = tf.reduce_max(target, axis, keep_dims=True)
        target_exp = tf.exp(target-max_axis)
        normalize = tf.reduce_sum(target_exp, axis, keep_dims=True)
        softmax = target_exp / normalize
        return softmax
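
For the score tensor in your question, it would presumably be applied along the last (class) axis; the placeholder shape below is just for illustration:

# illustrative usage (hypothetical shapes): softmax over the class axis
scores = tf.placeholder(tf.float32, [None, 256, 256, 21])  # [batch, height, width, num_classes]
probs = softmax(scores, axis=3)  # same shape; sums to 1 over the class axis for every pixel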

Upvotes: 1

Aaron

Reputation: 2364

Reshaping it to 2-D, applying tf.nn.softmax, and then reshaping it back, as you guessed, is the right approach.

Upvotes: 4
