Reputation: 1533
My input is a set of images and I want to normalize them to unit variance. But when I check the result with numpy, the variance should come out as 1 in the end, and it doesn't. What am I doing wrong in my code?
import tensorflow as tf

def pre_processing(img_list, zero_mean=True, unit_var=True):
    with tf.device('/cpu:0'):
        tn_img0 = img_list[0][1]
        tn_img1 = img_list[1][1]
        t_img = tn_img0
        # t_img = tf.concat([tn_img0, tn_img1], axis=0)
        # per-channel mean and variance over the height and width axes
        rgb_mean, rgb_var = tf.nn.moments(t_img, [0, 1])
        if zero_mean:
            tn_img0 = tf.subtract(img_list[0][1], rgb_mean)
            tn_img1 = tf.subtract(img_list[1][1], rgb_mean)
        if unit_var:
            tn_img0 = tf.divide(tn_img0, rgb_var)
            tn_img1 = tf.divide(tn_img1, rgb_var)
        return tn_img0, tn_img1
Upvotes: 0
Views: 309
Reputation: 17201
You should divide by the standard deviation, not the variance, to get unit variance for your inputs: since Var(X/σ) = Var(X)/σ², dividing by σ = sqrt(Var(X)) is what brings the variance to 1. So change your code to:
tn_img0 = tf.divide(tn_img0, tf.sqrt(rgb_var))
tn_img1 = tf.divide(tn_img1, tf.sqrt(rgb_var))
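As a quick check (a minimal sketch, assuming eager TensorFlow 2.x and a hypothetical random image rather than your actual img_list), normalizing by the standard deviation brings the per-channel variance measured with numpy close to 1:

import numpy as np
import tensorflow as tf

# Hypothetical example data: one RGB image with arbitrary scale and offset.
img = tf.constant(np.random.rand(64, 64, 3) * 50.0 + 10.0, dtype=tf.float32)

# Per-channel mean and variance over the height and width axes.
rgb_mean, rgb_var = tf.nn.moments(img, [0, 1])

# Subtract the mean and divide by the standard deviation.
normalized = tf.divide(tf.subtract(img, rgb_mean), tf.sqrt(rgb_var))

# Check with numpy: per-channel variance should be close to 1.
print(np.var(normalized.numpy(), axis=(0, 1)))  # ~[1. 1. 1.]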
Upvotes: 1