Bilal

Reputation: 115

Weight updates between shared variables in TensorFlow

I have shared weights between two layers using the reuse property, but I found that this actually shares only the initial values of the variables. The optimizer updates the weights in one layer, but that update is not reflected in the other layer.

with tf.variable_scope('conv1'):
    layer1 = conv2d(x, output_features=out_features, kernel_size=kernel_size, padding=padding, strides=strides)

with tf.variable_scope('conv1', reuse=True):
    layer2 = conv2d(x_shaped, output_features=out_features, kernel_size=kernel_size, padding=padding, strides=strides)

How can I make the weight updates reflect in both layers?

Upvotes: 1

Views: 130

Answers (1)

Sorin

Reputation: 11968

Give conv2d a name. By default the layer picks a unique name so that it doesn't clash with existing ones, which causes two different sets of variables to be created.
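
For example, a minimal sketch of what that looks like, using tf.layers.conv2d directly (your conv2d helper may wrap it; the filter count and the name 'shared' here are illustrative assumptions):

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 28, 28, 1])
x_shaped = tf.placeholder(tf.float32, [None, 28, 28, 1])

with tf.variable_scope('conv1'):
    # Explicit name: variables are created as conv1/shared/kernel and conv1/shared/bias
    layer1 = tf.layers.conv2d(x, filters=16, kernel_size=3, padding='same', name='shared')

with tf.variable_scope('conv1', reuse=True):
    # Same explicit name, so the existing variables are reused instead of new ones being created
    layer2 = tf.layers.conv2d(x_shaped, filters=16, kernel_size=3, padding='same', name='shared')

Because both calls resolve to the same variables, any optimizer update is visible through both layers.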

There should be a single set of variables. If you see two, then they are not shared.
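
A quick way to check is to list the graph's variables (assuming the names from the sketch above); with true sharing you should see exactly one kernel/bias pair under conv1:

for v in tf.global_variables():
    print(v.name)
# Shared:     conv1/shared/kernel:0, conv1/shared/bias:0
# Not shared: two kernel/bias pairs, e.g. one under conv1/... and another under conv1/..._1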

Other than that, you are using variable_scope correctly. There's also tf.AUTO_REUSE if you want to wrap the layer in a function that you can call multiple times.
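
A sketch of that pattern, again with the illustrative tf.layers.conv2d call and name from above:

def shared_conv(inputs):
    # tf.AUTO_REUSE creates the variables on the first call and reuses them on every later call
    with tf.variable_scope('conv1', reuse=tf.AUTO_REUSE):
        return tf.layers.conv2d(inputs, filters=16, kernel_size=3, padding='same', name='shared')

layer1 = shared_conv(x)
layer2 = shared_conv(x_shaped)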

Upvotes: 1
