haxtar

Reputation: 2070

softmax cross entropy return value

What does it mean if this is the return value of tf.losses.softmax_cross_entropy?

<tf.Tensor 'softmax_cross_entropy_loss/value:0' shape=() dtype=float32>

Does the fact that it states value:0 and shape=() mean that nothing was computed?

Upvotes: 0

Views: 703

Answers (1)

bnorm

Reputation: 399

Nothing has been computed yet, because you are printing a tensor in the graph before any data has been passed through it. Say you build the loss op with

sce = tf.losses.softmax_cross_entropy(onehot_labels, logits)

Then, to actually get the loss value, you have to run the op inside a session and feed data into it:

sess = tf.Session()
...
loss = sess.run(sce, feed_dict=feed_dict)

where feed_dict is the dictionary mapping your placeholders to your data. loss will now hold the actual numerical loss value.

value is just the name of the op that produced the tensor, indicating which group of computations it belongs to. For example, tf.reduce_mean returns <tf.Tensor 'Mean_1:0' shape=() dtype=float32> because it is a mean calculation. The 0 after the colon is not the tensor's current value; it is the output index of that op (most ops have a single output, so it is usually 0).

Additionally, your tensor's shape is () because the loss is a single scalar: it has no batch size, spatial (x or y) dimensions, or channels (assuming you are working with 4D tensors), so that is also fine.
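To see why shape () is expected, here is a minimal NumPy sketch (not TensorFlow, just an illustration of the same math) of softmax cross entropy: a per-example loss is computed, then averaged over the batch, so the final result is a single scalar.

```python
import numpy as np

def softmax_cross_entropy(logits, onehot_labels):
    # Subtract the row max for numerical stability before exponentiating.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Per-example loss: negative log-probability of the true class.
    per_example = -(onehot_labels * log_probs).sum(axis=1)
    # Mean over the batch collapses everything to one scalar, shape ().
    return per_example.mean()

logits = np.array([[2.0, 0.5, 0.1],
                   [0.2, 1.5, 0.3]])
labels = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])

loss = softmax_cross_entropy(logits, labels)
print(loss.shape)  # () -- a scalar, just like the TF loss tensor
```

The TF op does the same reduction internally (by default it averages over the batch), which is why the tensor you printed has an empty shape.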

Upvotes: 2
