Zardaloop

Reputation: 1634

weights does not exist, or was not created with tf.get_variable()

I have spent days trying to figure out what is going on, and I am still getting this error. Here is the error I get:

ValueError: Variable rnn/multi_rnn_cell/cell_1/basic_lstm_cell/weights does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?

And here is my sample code. Does anyone know what I am doing wrong?

import tensorflow as tf
from tensorflow.contrib import rnn

# n_steps, n_input, n_hidden and n_classes are defined elsewhere
x = tf.placeholder(tf.float32,[None,n_steps,n_input])
y = tf.placeholder(tf.float32,[None,n_classes])
weights = {
    'out': tf.Variable(tf.random_normal([n_hidden, n_classes]))
}
biases = {
    'out': tf.Variable(tf.random_normal([n_classes]))
}

def RNN(x, weights, biases):

    x = tf.unstack(x, n_steps, 1)

    lstm_cell = rnn.MultiRNNCell([cell() for _ in range(2)], state_is_tuple=True)

    # Get lstm cell output
    outputs, states = rnn.static_rnn(lstm_cell, x, dtype=tf.float32)

    # Linear activation, using rnn inner loop last output
    return tf.matmul(outputs[-1], weights['out']) + biases['out']

def cell():
    # reuse=True tells TensorFlow to look up existing variables, but on the
    # first call none have been created yet, which raises the ValueError above
    return rnn.BasicLSTMCell(n_hidden, forget_bias=0.1, reuse=True)

pred = RNN(x, weights, biases)

Upvotes: 2

Views: 1693

Answers (2)

javidcf

Reputation: 59731

If you want to reuse the weights, then the easiest way is to create a single cell object and pass it multiple times to MultiRNNCell:

import tensorflow as tf
from tensorflow.contrib import rnn

n_steps = 20
n_input = 10
n_classes = 5
n_hidden = 15

x = tf.placeholder(tf.float32,[None,n_steps,n_input])
y = tf.placeholder(tf.float32,[None,n_classes])
weights = {
    'in': tf.Variable(tf.random_normal([n_input, n_hidden])),
    'out': tf.Variable(tf.random_normal([n_hidden, n_classes]))
}
biases = {
    'in': tf.Variable(tf.random_normal([n_hidden])),
    'out': tf.Variable(tf.random_normal([n_classes]))
}

def RNN(x, weights, biases):

    # Initial input layer: project each time step from n_input to n_hidden,
    # so that every LSTM layer receives inputs of the same size
    inp = tf.tensordot(x, weights['in'], axes=1) + biases['in']
    inp = tf.nn.sigmoid(inp)
    inp = tf.unstack(inp, n_steps, 1)

    my_cell = cell()
    lstm_cell = rnn.MultiRNNCell([my_cell for _ in range(2)], state_is_tuple=True)

    # Get lstm cell output
    outputs, states = rnn.static_rnn(lstm_cell, inp, dtype=tf.float32)

    # Linear activation, using rnn inner loop last output
    return tf.matmul(outputs[-1], weights['out']) + biases['out']

def cell():
    # No reuse argument: variables are created normally on first use
    return rnn.BasicLSTMCell(n_hidden, forget_bias=0.1)

pred = RNN(x, weights, biases)

However, you have to make sure that sharing the variables makes sense dimension-wise, otherwise it will fail: fed the raw inputs directly, the first layer would receive vectors of size n_input while the second receives vectors of size n_hidden, so one shared cell cannot serve both layers. In this case, I have added an additional layer before the LSTM cells to make sure every LSTM input is the same size.
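
To see the constraint concretely: a BasicLSTMCell builds a single weight matrix of shape [input_size + n_hidden, 4 * n_hidden]. Without the extra layer, the first LSTM layer would need [n_input + n_hidden, 4 * n_hidden] while the second needs [n_hidden + n_hidden, 4 * n_hidden], so one shared cell cannot serve both. After the projection the required shapes match, which you can check with a quick inspection (illustrative only, not part of the model):

# With a single shared cell, the LSTM variables should appear once, with a
# weight matrix of shape [2 * n_hidden, 4 * n_hidden]
for v in tf.trainable_variables():
    print(v.name, v.shape)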

Upvotes: 1

Nipun Wijerathne

Reputation: 1829

If you don't need to reuse the cell, just use the following:

def cell():
    return rnn.BasicLSTMCell(n_hidden, forget_bias=0.1)
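
Applied to the code in the question, that means building the stack with a fresh cell per layer (a minimal sketch reusing the question's names):

# Each call to cell() creates a new BasicLSTMCell with its own variables,
# so no reuse is requested and the ValueError goes away.
# x here is the list of per-step tensors produced by tf.unstack, as in the question
lstm_cell = rnn.MultiRNNCell([cell() for _ in range(2)], state_is_tuple=True)
outputs, states = rnn.static_rnn(lstm_cell, x, dtype=tf.float32)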

Otherwise, if you do need to reuse the weights, the Reusing Variable of LSTM in Tensorflow post has a nice explanation.
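
For reference, the usual TF 1.x reuse pattern is to build the graph inside a variable scope and pass reuse=True only on calls after the first (a minimal sketch; the scope name 'my_rnn' and the second call are illustrative, not taken from the linked post):

def RNN(x, weights, biases, reuse=None):
    # The first call (reuse=None) creates the LSTM variables; later calls
    # with reuse=True look up the existing ones instead of recreating them
    with tf.variable_scope('my_rnn', reuse=reuse):
        inputs = tf.unstack(x, n_steps, 1)
        stack = rnn.MultiRNNCell(
            [rnn.BasicLSTMCell(n_hidden, forget_bias=0.1) for _ in range(2)],
            state_is_tuple=True)
        outputs, _ = rnn.static_rnn(stack, inputs, dtype=tf.float32)
        return tf.matmul(outputs[-1], weights['out']) + biases['out']

pred_train = RNN(x, weights, biases)             # creates the variables
pred_test = RNN(x, weights, biases, reuse=True)  # shares the same variables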

Upvotes: 2
