Reputation: 21
I ran ptb_word_lm.py
as provided with TensorFlow 1.0, but it shows this message:
ValueError: Attempt to have a second RNNCell use the weights of a variable scope that already has weights: 'Model/RNN/multi_rnn_cell/cell_0/basic_lstm_cell'; and the cell was not constructed as BasicLSTMCell(..., reuse=True). To share the weights of an RNNCell, simply reuse it in your second calculation, or create a new one with the argument reuse=True.
Then I modified the code, adding reuse=True
to BasicLSTMCell
, but it shows this message:
ValueError: Variable Model/RNN/multi_rnn_cell/cell_0/basic_lstm_cell/weights does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
How can I solve this problem?
Upvotes: 2
Views: 3092
Reputation: 463
The following worked for me:
with tf.variable_scope('rnn'):
    outputs, final_state = tf.nn.dynamic_rnn(lstm_cell, X_in, initial_state=init_state, time_major=False, scope='rnn')
Upvotes: 1
Reputation: 37
You can try adding scope='lstmrnn'
to your tf.nn.dynamic_rnn()
call.
Upvotes: 0
Reputation: 2111
Just add the following line at the top of your code:
tf.reset_default_graph()
Upvotes: 0
Reputation: 36
Modify lstm_cell() as follows (note that this requires import inspect at the top of the file):

def lstm_cell():
    # TF >= 1.1 accepts a `reuse` argument; pass it only when supported.
    if 'reuse' in inspect.signature(tf.contrib.rnn.BasicLSTMCell.__init__).parameters:
        return tf.contrib.rnn.BasicLSTMCell(size, forget_bias=0.0,
                                            state_is_tuple=True,
                                            reuse=tf.get_variable_scope().reuse)
    else:
        return tf.contrib.rnn.BasicLSTMCell(
            size, forget_bias=0.0, state_is_tuple=True)
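The version check above relies on inspect.signature. Here is a minimal, self-contained sketch of that feature-detection pattern using plain Python functions; the two constructors are hypothetical stand-ins for the old (no reuse keyword) and new (reuse keyword) BasicLSTMCell signatures:

```python
import inspect

# Hypothetical stand-in for the newer constructor, which accepts `reuse`.
def new_style_init(self, size, forget_bias=0.0, state_is_tuple=True, reuse=None):
    pass

# Hypothetical stand-in for the older constructor, which does not.
def old_style_init(self, size, forget_bias=0.0, state_is_tuple=True):
    pass

def supports_reuse(init_fn):
    # The same trick used in the answer above: inspect the function's
    # signature and check whether it declares a `reuse` parameter.
    return 'reuse' in inspect.signature(init_fn).parameters

print(supports_reuse(new_style_init))  # True
print(supports_reuse(old_style_init))  # False
```

This lets the same script run against multiple TensorFlow versions without a hard-coded version comparison.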
My environment:
Windows 10 x64
tensorflow-gpu 1.1.0
The test runs OK.
Upvotes: 0
Reputation: 11
Adding reuse=tf.get_variable_scope().reuse
to BasicLSTMCell worked for me.
Upvotes: 1