A.M.

Reputation: 1797

Batch normalization initializer in TensorFlow

In TensorFlow, the batch normalization parameters are beta, gamma, moving mean, and moving variance. However, tf.contrib.layers.batch_norm(*args, **kwargs) exposes only one argument for initializing them, called param_initializers, which, according to the documentation, contains optional initializers for beta, gamma, moving mean, and moving variance.

How can we use param_initializers to initialize these parameters?

Upvotes: 0

Views: 3055

Answers (1)

Martin Thoma

Reputation: 136187

Here is how you use batch normalization with TensorFlow 1.0:

import tensorflow as tf

batch_normalization = tf.layers.batch_normalization

# ... (define the network)
net = batch_normalization(net)
# ... (define the network)

If you want to set parameters, just do it like this:

net = batch_normalization(net, 
                          beta_initializer=tf.zeros_initializer(), 
                          moving_variance_initializer=tf.ones_initializer())
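For the tf.contrib.layers.batch_norm API the question asks about, param_initializers is a dictionary keyed by parameter name. A sketch, with the key names taken from the parameters listed in the documentation (assumed, not verified against the contrib source):

```python
import tensorflow as tf

# net is assumed to be an already-defined layer output.
# Each key maps a batch-norm parameter to its initializer;
# keys you omit fall back to the layer's defaults.
net = tf.contrib.layers.batch_norm(
    net,
    param_initializers={
        'beta': tf.zeros_initializer(),
        'gamma': tf.ones_initializer(),
        'moving_mean': tf.zeros_initializer(),
        'moving_variance': tf.ones_initializer(),
    })
```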

*args, **kwargs

This is the Python way to pass arbitrarily many non-keyword arguments (collected in args) and arbitrarily many keyword arguments (collected in kwargs). For example:

def test(*args, **kwargs):
    print("#" * 80)
    print(args)
    print("#" * 80)
    print(kwargs)

test(1, 2, 42, 3.141, 'foo', a=7, b=3, c='bla')

gives

################################################################################
(1, 2, 42, 3.141, 'foo')
################################################################################
{'a': 7, 'c': 'bla', 'b': 3}
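This is also why the documentation writes the signature as batch_norm(*args, **kwargs): the function can collect whatever you pass and hand it on unchanged. A minimal sketch of that forwarding pattern (the function name forward is made up for illustration):

```python
def forward(*args, **kwargs):
    # Positional arguments arrive as a tuple, keyword arguments as a
    # dict; returning them unchanged mimics how a wrapper passes its
    # arguments through to an inner function.
    return args, kwargs

args, kwargs = forward('net', scale=True, epsilon=1e-3)
print(args)    # ('net',)
print(kwargs)  # {'scale': True, 'epsilon': 0.001}
```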

Upvotes: 3
