Reputation: 4680
I'm learning TensorFlow these days.
When working with random values, I noticed there are two different kinds of APIs. One is an op such as tf.random_normal()
, and the other is an initializer such as tf.random_normal_initializer()
. I think these two do exactly the same thing. The following is an example.
import tensorflow as tf

random_normal = tf.random_normal_initializer(0.0, 1.0, seed=0)
a = random_normal([10])
b = tf.random_normal([10], 0.0, 1.0)

with tf.Session() as sess:
    print(sess.run(a))
    print(sess.run(b))
I have two questions.
Q1. If these two do exactly the same thing, why are there two separate APIs?
Q2. I assumed that class names in TensorFlow start with an upper-case letter, such as tf.Variable
, and op names start with a lower-case letter. Why, then, does tf.random_normal_initializer
start with a lower-case letter when it is actually a class, not an op?
Upvotes: 1
Views: 500
Reputation: 3974
Did you have a look at the respective implementations? The initializer (defined here) is a class that internally calls tf.random_normal()
, which is just an op (defined here). The attributes of the class instance can, for example, be exported for further use, whereas the parameters of the op cannot.
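For illustration, here is a minimal sketch (assuming TF 1.x) of that practical difference: the initializer instance keeps its parameters around and can be reused or serialized, while the op just emits a tensor with the parameters baked into the graph.
import tensorflow as tf

# the initializer is a configured, reusable object; its parameters stay inspectable
init = tf.random_normal_initializer(mean=0.0, stddev=1.0, seed=0)
print(init.mean, init.stddev)   # attributes can be read back
print(init.get_config())        # ...or exported, e.g. for serialization

# the same configured initializer can be handed to several variables
w = tf.get_variable("w", shape=[10], initializer=init)
b = tf.get_variable("b", shape=[10], initializer=init)

# the op just returns a tensor; mean/stddev are baked into the graph node
t = tf.random_normal([10], mean=0.0, stddev=1.0)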
Concerning your second question: your assumption is right, but the name used is just an alias. The corresponding file contains
@tf_export("keras.initializers.RandomNormal", "initializers.random_normal",
           "random_normal_initializer")
class RandomNormal(Initializer):
  """Initializer that generates tensors with a normal distribution.
You can see that the class itself, RandomNormal, follows the naming convention; tf.random_normal_initializer is just one of its export aliases.
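If you want to check this yourself, a quick sketch (again assuming TF 1.x) shows that the lower-case name is merely an alias for the class:
import tensorflow as tf
import inspect

print(inspect.isclass(tf.random_normal_initializer))   # expected: True, it is a class
print(tf.random_normal_initializer.__name__)           # expected: 'RandomNormal'
print(inspect.isfunction(tf.random_normal))            # expected: True, the op is a plain function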
Hope this helps.
Upvotes: 1