林彥良

Reputation: 94

Should I initialize the variables myself, or just initialize them with tf.global_variables_initializer()?

I'm doing a time series forecasting task. When using the RNNCell in TensorFlow, I don't write the variable initialization myself; instead, the common pattern is to call tf.global_variables_initializer() to do this work.

It feels strange to just call one function and have it initialize all the variables for me. I've heard that tf.global_variables_initializer() retrieves a list of variables containing [all the weights, all the biases, all the hidden states], but it feels like a black box since I don't know which variables it covers, and I don't use a bias when I write the network myself.

For the TensorFlow users here: do you prefer to initialize variables yourself (for example, using tf.Variable)? Is there any problem or disadvantage to just calling tf.global_variables_initializer()?
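For context, here's a minimal sketch of the two patterns I mean, assuming TF 1.x-style graph mode (written against tf.compat.v1 so it also runs on newer versions; the variable names are just toy examples):

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph mode
tf.disable_eager_execution()

# Two toy variables, standing in for an RNN cell's weights and bias.
W = tf.Variable(tf.random.normal([4, 4]), name="W")
b = tf.Variable(tf.zeros([4]), name="b")

with tf.Session() as sess:
    # Option 1: one call initializes every variable in GraphKeys.GLOBAL_VARIABLES.
    sess.run(tf.global_variables_initializer())
    # Option 2: initialize only the variables I pick, myself.
    sess.run([W.initializer, b.initializer])
    b_val = sess.run(b)  # zeros, as specified by its initializer
```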

I'd like to hear everyone's opinions, thanks!

Upvotes: 1

Views: 245

Answers (2)

nairouz mrabah

Reputation: 1217

I also don't like using tf.global_variables_initializer(). In fact, variables can be efficiently and succinctly defined using TF-Slim.
I'm quoting from the TensorFlow-Slim documentation:

Creating Variables in native tensorflow requires either a predefined value or an initialization mechanism (e.g. randomly sampled from a Gaussian). Furthermore, if a variable needs to be created on a specific device, such as a GPU, the specification must be made explicit. To alleviate the code required for variable creation, TF-Slim provides a set of thin wrapper functions in variables.py which allow callers to easily define variables.

For example, to create a weights variable, initialize it using a truncated normal distribution, regularize it with an l2_loss and place it on the CPU, one need only declare the following:

weights = slim.variable('weights', shape=[10, 10, 3, 3],
                        initializer=tf.truncated_normal_initializer(stddev=0.1),
                        regularizer=slim.l2_regularizer(0.05),
                        device='/CPU:0')

Upvotes: 0

David Parks

Reputation: 32081

The global variables initializer isn't a black box; it uses the GraphKeys collections, which are described well here:

Usage of tf.GraphKeys

Most operations in TensorFlow add variables to the appropriate collection as a matter of convention. By default, the optimizers use these same collections to decide which variables are trainable. You can inspect the appropriate collection to see exactly which variables it contains.
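To make the mechanics concrete, here is a toy pure-Python sketch of the collection idea (the names mirror TensorFlow's but this is not TensorFlow code; it's an illustration of the pattern): each variable registers itself in a collection when it is created, and the "global initializer" simply runs every registered initializer.

```python
# Toy sketch: variables register into named collections at creation time,
# and the global initializer just walks the "global" collection.
collections = {"global": [], "local": []}

class Var:
    def __init__(self, name, init_value, collection="global"):
        self.name = name
        self.init_value = init_value
        self.value = None                      # uninitialized until init() runs
        collections[collection].append(self)   # registration at creation

    def init(self):
        self.value = self.init_value

def global_variables_initializer():
    # No black box: just run every initializer registered in the collection.
    for v in collections["global"]:
        v.init()

w = Var("weights", [[0.1, 0.2]])
total = Var("metric_total", 0.0, collection="local")  # like a tf.metrics.* variable
global_variables_initializer()
print(w.value)       # [[0.1, 0.2]] -- initialized
print(total.value)   # None -- local vars need their own initializer
```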

In my experience, almost everything is a global variable, except for variables created by the tf.metrics.* functions, which default to being local variables (so you can reset metrics with the local variables initializer).

Upvotes: 1
