Reputation: 1395
I am learning TensorFlow, and I picked up the following code from the TensorFlow website. According to my understanding, axis=0 is for rows and axis=1 is for columns.
How are they getting the output mentioned in the comments? I have written the output according to my thinking after ##.
import tensorflow as tf
x = tf.constant([[1, 1, 1], [1, 1, 1]])
tf.reduce_sum(x, 0) # [2, 2, 2] ## [3, 3]
tf.reduce_sum(x, 1) # [3, 3] ##[2, 2, 2]
tf.reduce_sum(x, [0, 1]) # 6 ## Didn't understand at all.
Upvotes: 52
Views: 65858
Reputation: 1
If you know R, reduce_sum is the equivalent of rowSums and colSums in R, with the ability to do both simultaneously if you pass both axes as the second parameter.
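For reference, a minimal sketch of that correspondence in Python (this assumes TensorFlow 2.x with eager execution; the R names appear only in the comments):
import tensorflow as tf

x = tf.constant([[1, 1, 1], [1, 1, 1]])

col_sums = tf.reduce_sum(x, 0)    # like R's colSums(x): one sum per column -> [2, 2, 2]
row_sums = tf.reduce_sum(x, 1)    # like R's rowSums(x): one sum per row    -> [3, 3]
total = tf.reduce_sum(x, [0, 1])  # both axes at once: sum of every element -> 6

print(col_sums.numpy(), row_sums.numpy(), total.numpy())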
Upvotes: -1
Reputation: 168
x has 2 rows and 3 columns such that:
1 1 1
1 1 1
Reducing along rows (tf.reduce_sum(x, 0)) means you are squeezing from bottom and top so that two separate rows become one row. It will become [2, 2, 2].
Reducing along columns (tf.reduce_sum(x, 1)) means you are squeezing from right and left so that three separate columns become one column, i.e. [3, 3].
Finally, tf.reduce_sum(x, [0, 1]) means you first squeeze from bottom and top (it will become [2, 2, 2]) and then squeeze [2, 2, 2] from right and left so that it becomes 6.
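A minimal sketch of that squeezing in code, tracking how the shape shrinks at each step (assuming TensorFlow 2.x eager execution):
import tensorflow as tf

x = tf.constant([[1, 1, 1], [1, 1, 1]])   # shape (2, 3)

squeezed_rows = tf.reduce_sum(x, 0)       # [2, 2, 2], shape (3,)
squeezed_cols = tf.reduce_sum(x, 1)       # [3, 3],    shape (2,)
squeezed_both = tf.reduce_sum(x, [0, 1])  # 6,         shape () -- a scalar

print(squeezed_rows.shape, squeezed_cols.shape, squeezed_both.shape)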
Upvotes: 3
Reputation: 2604
Think of it like this: the axis indicates the dimension that will be eliminated. So for the first case the axis is 0; if you go through this dimension (2 entries), they all collapse into 1. Thus it will be as follows:
result = [[1,1,1] + [1,1,1]] = [2,2,2]
So you removed dimension 0.
Now, for the second case, you collapse axis 1 (the columns), so:
result = [[1,1] + [1,1] + [1,1]] = [3,3]
And in the last case you keep collapsing in the order indicated in the brackets. In other words, first you eliminate the rows and then the columns:
result1 = [2,2,2]
result_final = 2 + 2 + 2 = 6
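The rule "the axis you pass is the one that gets eliminated" also generalizes beyond 2-D; here is a sketch with a 3-D tensor (assuming TensorFlow 2.x):
import tensorflow as tf

t = tf.ones([2, 3, 4])                 # shape (2, 3, 4)

print(tf.reduce_sum(t, 0).shape)       # (3, 4)  -> axis 0 eliminated
print(tf.reduce_sum(t, 1).shape)       # (2, 4)  -> axis 1 eliminated
print(tf.reduce_sum(t, 2).shape)       # (2, 3)  -> axis 2 eliminated
print(tf.reduce_sum(t, [0, 2]).shape)  # (3,)    -> axes 0 and 2 eliminated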
Hope this helps!
Upvotes: 10
Reputation: 2291
The tf.reduce_sum(x, [0, 1]) command will calculate the sum across axis = 0 (row-wise) first, then the sum across axis = 1 (column-wise).
For example, with
x = tf.constant([[1, 1, 1], [1, 1, 1]])
you get [2, 2, 2] after summing across axis = 0, and then 2 + 2 + 2 after summing across axis = 1.
Finally, you get 6 as the output.
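Strictly speaking, once axis 0 is reduced the result is 1-D and no longer has an axis 1; keepdims=True keeps the reduced axis around as size 1, which makes the "axis 0 first, then axis 1" picture literal. A sketch assuming TensorFlow 2.x, where the parameter is called keepdims:
import tensorflow as tf

x = tf.constant([[1, 1, 1], [1, 1, 1]])

step1 = tf.reduce_sum(x, 0, keepdims=True)      # shape (1, 3): [[2, 2, 2]]
step2 = tf.reduce_sum(step1, 1, keepdims=True)  # shape (1, 1): [[6]]

print(step2.numpy())                            # [[6]]
print(tf.reduce_sum(x, [0, 1]).numpy())         # 6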
Upvotes: 1
Reputation: 355
In order to understand better what is going on, I will change the values, and the results become self-explanatory:
import tensorflow as tf
x = tf.constant([[1, 2, 4], [8, 16, 32]])
a = tf.reduce_sum(x, 0) # [ 9 18 36]
b = tf.reduce_sum(x, 1) # [ 7 56]
c = tf.reduce_sum(x, [0, 1]) # 63
with tf.Session() as sess:
    output_a = sess.run(a)
    print(output_a)
    output_b = sess.run(b)
    print(output_b)
    output_c = sess.run(c)
    print(output_c)
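If you are on TensorFlow 2.x, where eager execution is on by default, the same values can be printed without a Session (a minimal sketch):
import tensorflow as tf

x = tf.constant([[1, 2, 4], [8, 16, 32]])

print(tf.reduce_sum(x, 0).numpy())       # [ 9 18 36]
print(tf.reduce_sum(x, 1).numpy())       # [ 7 56]
print(tf.reduce_sum(x, [0, 1]).numpy())  # 63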
Upvotes: 25
Reputation: 53768
The input is a 2-D tensor:
1 1 1
1 1 1
The 0 axis in TensorFlow is the rows, the 1 axis is the columns. The sum along the 0 axis produces a 1-D tensor of length 3, where each element is a per-column sum. The result is thus [2, 2, 2]. Likewise for the rows.
The sum along both axes is, in this case, the sum of all values in the tensor, which is 6.
Comparison to numpy:
a = np.array([[1, 1, 1], [1, 1, 1]])
np.sum(a, axis=0) # [2 2 2]
np.sum(a, axis=1) # [3 3]
np.sum(a, axis=(0, 1)) # 6
As you can see, the output is the same.
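A quick cross-check that the two APIs agree on this input (a sketch assuming TensorFlow 2.x and NumPy are installed):
import numpy as np
import tensorflow as tf

a = np.array([[1, 1, 1], [1, 1, 1]])

np.testing.assert_array_equal(np.sum(a, axis=0), tf.reduce_sum(a, 0).numpy())  # [2 2 2]
np.testing.assert_array_equal(np.sum(a, axis=1), tf.reduce_sum(a, 1).numpy())  # [3 3]
assert np.sum(a, axis=(0, 1)) == tf.reduce_sum(a, [0, 1]).numpy()              # 6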
Upvotes: 22
Reputation: 3159
x has a shape of (2, 3) (two rows and three columns):
1 1 1
1 1 1
By doing tf.reduce_sum(x, 0) the tensor is reduced along the first dimension (rows), so the result is [1, 1, 1] + [1, 1, 1] = [2, 2, 2].
By doing tf.reduce_sum(x, 1) the tensor is reduced along the second dimension (columns), so the result is [1, 1] + [1, 1] + [1, 1] = [3, 3].
By doing tf.reduce_sum(x, [0, 1]) the tensor is reduced along BOTH dimensions (rows and columns), so the result is 1 + 1 + 1 + 1 + 1 + 1 = 6 or, equivalently, [1, 1, 1] + [1, 1, 1] = [2, 2, 2], and then 2 + 2 + 2 = 6 (reduce along rows, then reduce the resulting array).
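The same 6 also comes out if you reduce in the opposite order (columns first, then the resulting array), since summation doesn't depend on the order of the reductions; a sketch assuming TensorFlow 2.x:
import tensorflow as tf

x = tf.constant([[1, 1, 1], [1, 1, 1]])

rows_then_rest = tf.reduce_sum(tf.reduce_sum(x, 0), 0)  # [2, 2, 2] -> 6
cols_then_rest = tf.reduce_sum(tf.reduce_sum(x, 1), 0)  # [3, 3]    -> 6
both_at_once = tf.reduce_sum(x, [0, 1])                 # 6

print(rows_then_rest.numpy(), cols_then_rest.numpy(), both_at_once.numpy())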
Upvotes: 84