In TensorFlow, the batch normalization parameters are beta, gamma, moving_mean, and moving_variance. However, tf.contrib.layers.batch_norm(*args, **kwargs) exposes only a single argument for initializing these parameters, called param_initializers, which according to the documentation contains optional initializers for beta, gamma, moving_mean, and moving_variance.

How can we use param_initializers to initialize these parameters?
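For the tf.contrib API specifically, param_initializers takes a dictionary keyed by parameter name. Here is a minimal sketch, assuming TensorFlow 1.x (tf.contrib was removed in TensorFlow 2); the wrapper function and the choice of initializers are only for illustration, and the dictionary keys follow the documented parameter names:

```python
def build_batch_norm(net):
    # Assumes TensorFlow 1.x; tf.contrib does not exist in TensorFlow 2.
    import tensorflow as tf

    # One initializer per batch-norm parameter, keyed by its variable name.
    param_init = {
        'beta': tf.zeros_initializer(),
        'gamma': tf.ones_initializer(),
        'moving_mean': tf.zeros_initializer(),
        'moving_variance': tf.ones_initializer(),
    }
    return tf.contrib.layers.batch_norm(net, param_initializers=param_init)
```

Any key you omit from the dictionary falls back to the layer's default initializer.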
Here is how you use batch normalization with TensorFlow 1.0:

```python
import tensorflow as tf

batch_normalization = tf.layers.batch_normalization

# ... (define the network)
net = batch_normalization(net)
# ... (define the network)
```
If you want to set the initializers, just pass them as keyword arguments:

```python
net = batch_normalization(net,
                          beta_initializer=tf.zeros_initializer(),
                          moving_variance_initializer=tf.ones_initializer())
```
This is the Python way of accepting arbitrarily many positional arguments (args) and arbitrarily many keyword arguments (kwargs). For example:

```python
def test(*args, **kwargs):
    print("#" * 80)
    print(args)
    print("#" * 80)
    print(kwargs)

test(1, 2, 42, 3.141, 'foo', a=7, b=3, c='bla')
```
gives

```
################################################################################
(1, 2, 42, 3.141, 'foo')
################################################################################
{'a': 7, 'c': 'bla', 'b': 3}
```

(On Python 3.7 and later, keyword arguments preserve insertion order, so the last line prints as {'a': 7, 'b': 3, 'c': 'bla'}.)
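The same mechanism also works in the other direction: a wrapper can forward everything it receives with fn(*args, **kwargs), which is why decorated library functions such as batch_norm appear in the docs with the generic (*args, **kwargs) signature. A minimal sketch (the logged decorator and scale function are hypothetical, not part of TensorFlow):

```python
def logged(fn):
    # Forward all positional and keyword arguments unchanged to fn.
    def wrapper(*args, **kwargs):
        print("calling", fn.__name__, "with", args, kwargs)
        return fn(*args, **kwargs)
    return wrapper

@logged
def scale(x, factor=2):
    return x * factor

print(scale(21))            # forwarded as a single positional argument
print(scale(3, factor=10))  # positional and keyword arguments both pass through
```

The wrapper never needs to know scale's real signature; *args and **kwargs carry whatever the caller supplied.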