user6557479


TensorFlow tutorial log-loss implementation

I need to learn TensorFlow quickly and I can't understand this part:

cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))

The tutorial explains it like this: First, tf.log computes the logarithm of each element of y. Next, we multiply each element of y_ with the corresponding element of tf.log(y). Then tf.reduce_sum adds the elements in the second dimension of y, due to the reduction_indices=[1] parameter. Finally, tf.reduce_mean computes the mean over all the examples in the batch.

Why does it do these manipulations? Why do we need another dimension? Thanks

Upvotes: 1

Views: 1061

Answers (1)

Gregory Begelman

Reputation: 554

There are two dimensions because cross_entropy is computed over a batch of training examples: dimension 0 runs over the batch, and dimension 1 runs over the classes of a single example. For example, with 3 possible classes and a batch size of 2, y is a 2D tensor of shape (2, 3).
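
To make that concrete, here is a minimal sketch with a batch of 2 examples and 3 classes. The values are made up for illustration, and it uses the current TF spellings tf.math.log and axis=1 in place of the tutorial's tf.log and reduction_indices=[1]:

    import tensorflow as tf

    # Dummy batch: 2 examples, 3 classes (values made up for illustration).
    y  = tf.constant([[0.7, 0.2, 0.1],   # predicted probabilities (softmax output)
                      [0.1, 0.8, 0.1]])
    y_ = tf.constant([[1.0, 0.0, 0.0],   # one-hot true labels
                      [0.0, 1.0, 0.0]])

    log_y = tf.math.log(y)                            # elementwise log, shape (2, 3)
    per_example = -tf.reduce_sum(y_ * log_y, axis=1)  # sum over classes, shape (2,)
    cross_entropy = tf.reduce_mean(per_example)       # mean over the batch, scalar

    print(per_example.numpy())    # one loss per example: ~[0.357 0.223]
    print(cross_entropy.numpy())  # ~0.29

Note that the two reductions collapse different dimensions: tf.reduce_sum removes axis 1 (classes), leaving one loss value per example, and tf.reduce_mean then averages over the remaining axis 0 (the batch).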

Upvotes: 1
