User

Reputation: 826

Map the output values of a network in 0 or 1 TensorFlow

I have a network with the output layer of size [3, 13000, 3, 1] (B,H,W,C) and I transformed it using tf.reduce_mean to obtain an output size [3, 13000, 1].

Graphically, it looks like this: [image of the output transformation]

Is this right?
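A minimal shape check of the transformation described above, with NumPy standing in for the TensorFlow tensors (in TF this would be `tf.reduce_mean(out, axis=2)`; the random data is just a placeholder):

```python
import numpy as np

# Stand-in for the network output: batch=3, height=13000, width=3, channels=1 (B, H, W, C)
out = np.random.rand(3, 13000, 3, 1).astype(np.float32)

# Averaging over axis 2 (W) collapses [3, 13000, 3, 1] to [3, 13000, 1],
# matching the shape of the labels.
reduced = out.mean(axis=2)
```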

My labels have the same shape as my new output, [3, 13000, 1], and are all values 0 or 1.

Now I have to compute the loss against the labels. To compute it I use tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(logits=predict, labels=y)), but first I have to transform all the values in the output into 0 or 1. I'm using the tf.nn.softmax function, but I get all 1s.

How can I implement a function that maps all values under a threshold to 0 and all values above it to 1? The threshold could be, for example, (max value - min value) / 2. The function should also work with the gradient in the backprop step.
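The "all 1s" behaviour can be reproduced directly: softmax normalizes over the class axis, and here that axis has size 1, so each element is divided by itself. A small sketch with NumPy standing in for `tf.nn.softmax` (the logit values are made up):

```python
import numpy as np

# Shape (1, 3, 1): one logit per spatial position, a single "class" channel.
logits = np.array([[[0.3], [-2.0], [5.7]]])

# Softmax over the last axis: exp(x) / sum(exp(x)). With a size-1 axis the
# sum is just exp(x) itself, so the result is 1 everywhere.
e = np.exp(logits - logits.max(axis=-1, keepdims=True))
softmax = e / e.sum(axis=-1, keepdims=True)
```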

Upvotes: 0

Views: 1646

Answers (1)

Vijay Mariappan

Reputation: 17191

Since your prediction is a single class value, applying softmax to it always yields 1, irrespective of the value: exp(predict) / sum(exp(predict)) = exp(predict) / exp(predict) = 1. Either convert your labels to one-hot and make the model predict two classes [0, 1], or use sigmoid cross entropy instead.
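The sigmoid option fits the existing shapes with no change: each element is treated as an independent binary prediction. A NumPy sketch of the numerically stable formula documented for `tf.nn.sigmoid_cross_entropy_with_logits` (the logit/label values below are illustrative):

```python
import numpy as np

def sigmoid_xent(logits, labels):
    # Stable form used by tf.nn.sigmoid_cross_entropy_with_logits:
    # max(x, 0) - x*z + log(1 + exp(-|x|))
    x, z = logits, labels
    return np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

logits = np.array([[2.0], [-1.5], [0.0]])  # raw scores, one per position, no softmax needed
labels = np.array([[1.0], [0.0], [1.0]])   # binary ground truth, same shape as logits
loss = sigmoid_xent(logits, labels).mean() # analogue of the tf.reduce_mean wrapper
```

Note that this takes the raw logits directly; there is no need to threshold the outputs before computing the loss, which keeps the whole path differentiable.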

Upvotes: 1
