Niranjan Behera

Reputation: 21

Unable to convert TensorFlow 1.0 code to TensorFlow 2.0

I have TensorFlow 1.0 code and am unable to convert it to TensorFlow 2.0 using the syntax below. Could you please help me out?

A)

lstm_cell = tf.keras.layers.LSTM(units=hidden_unit)
#lstm_cell = tf.compat.v1.nn.rnn_cell.DropoutWrapper(lstm_cell, output_keep_prob=self.dropout_keep_prob)

Q-1) How do I use dropout for the lstm_cell in TF 2.0?

B)

self._initial_state = lstm_cell.zero_state(self.batch_size, tf.float32)

Q-2) When I use the above syntax, I get the error "LSTM cell does not have zero_state cell for TF2.0".

How do I initialize the LSTM cell?

C) How do I use tf.keras.layers.RNN in TF 2.0?

Upvotes: 1

Views: 920

Answers (2)

Niranjan Behera

Reputation: 21

Thanks @AlexisBRENON! Here is my code. Please let me know if I made any mistake.

lstm_cell = tf.keras.layers.LSTM(units=hidden_unit)
lstm_cell = tf.nn.RNNCellDropoutWrapper(lstm_cell, output_keep_prob=self.dropout_keep_prob)
self._initial_state = lstm_cell.get_initial_state(self.batch_size, tf.float32)
inputs = [tf.squeeze(input_, [1]) for input_ in tf.split(pooled_concat, num_or_size_splits=int(reduced), axis=1)]

outputs, state_size = tf.keras.layers.RNN(lstm_cell, inputs, initial_state=self._initial_state, return_sequences=self.real_len)

# Want to collect the appropriate last words into variable output (dimension = batch x embedding_size)
output = outputs[0]

ERROR:

self._initial_state = lstm_cell.get_initial_state(self.batch_size, tf.float32)
ValueError: slice index 0 of dimension 0 out of bounds. for 'strided_slice' (op: 'StridedSlice') with input shapes: [0], [1], [1], [1] and with computed input tensors: input[1] = <0>, input[2] = <1>, input[3] = <1>.

Upvotes: 1

AlexisBRENON

Reputation: 3079

For the RNN dropout, the DropoutWrapper has been moved to tf.nn.RNNCellDropoutWrapper.
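As a minimal sketch (assuming TF 2.x with the Keras API, and made-up sizes), the simplest route is often the dropout/recurrent_dropout arguments built into the Keras LSTM layer itself, rather than wrapping a cell:

```python
import tensorflow as tf

hidden_unit = 64  # hypothetical size, not from the original post

# Dropout on the inputs (dropout=) and on the recurrent state
# (recurrent_dropout=) is built into the layer; it is only
# applied when the layer is called with training=True.
lstm = tf.keras.layers.LSTM(units=hidden_unit,
                            dropout=0.2,
                            recurrent_dropout=0.2)

x = tf.random.normal((8, 10, 16))  # (batch, timesteps, features)
y = lstm(x)                        # last hidden state, shape (batch, hidden_unit)
```

Note that these arguments are keep-independent: they are drop probabilities, the complement of TF 1.x's output_keep_prob.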

I suppose that tf.keras.layers.LSTMCell.get_initial_state is the new name for zero_state.
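For reference, a sketch of supplying a zero initial state explicitly in TF 2.x (batch size and unit count here are made up): passing an [h, c] pair of zero tensors as initial_state reproduces what zero_state did in TF 1.x, without depending on the get_initial_state signature:

```python
import tensorflow as tf

batch_size, hidden_unit = 8, 64  # hypothetical values

lstm = tf.keras.layers.LSTM(hidden_unit, return_state=True)

# Equivalent of TF 1.x's lstm_cell.zero_state(batch_size, tf.float32):
# an LSTM state is a pair of (hidden state h, cell state c) tensors.
h0 = tf.zeros((batch_size, hidden_unit))
c0 = tf.zeros((batch_size, hidden_unit))

x = tf.random.normal((batch_size, 10, 16))  # (batch, timesteps, features)
output, h, c = lstm(x, initial_state=[h0, c0])
```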

You should be more precise about what you want to do with RNNs. tf.keras.layers.RNN is a base class for recurrent layers and should not be used as is. Instead, you should use one of its sub-classes like SimpleRNN, GRU or LSTM, or write your own sub-class. Take a look at the TensorFlow tutorial on recurrent neural networks.
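For example (a sketch with made-up shapes), a whole padded sequence can be fed through the LSTM sub-class directly, with no manual cell wiring or tf.split loop:

```python
import tensorflow as tf

x = tf.random.normal((8, 10, 16))  # (batch, timesteps, features)

# return_sequences=True yields the output at every timestep
# rather than only the last one.
layer = tf.keras.layers.LSTM(32, return_sequences=True)
y = layer(x)  # shape (batch, timesteps, units) = (8, 10, 32)
```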

Upvotes: 0
