Bomps

Reputation: 27

tf.Variable gets converted to normal tensor in loop

I have a custom Keras layer in which I define some variables and then assign values to them in the call method.

import tensorflow as tf
from tensorflow.keras import layers as KL

class Mine_layer(KL.Layer):
  def __init__(self, shape):
    super().__init__()
    self.block = tf.Variable(tf.constant(1, shape=shape))
  def call(self, indeces):
    self.block = self.block[indeces[0][0], indeces[0][1]].assign(1)

This works, but if I try to use a for loop over all of the indices:

for i in tf.range(0, limit=tf.shape(indeces)[0]):
  self.block = self.block[indeces[i][0], indeces[i][1]].assign(1)

This gives me an error saying "'Tensor' object has no attribute 'assign'".

Why does this happen? How can I solve it?

I tried looking at the documentation but I still don't get it.

Thanks in advance to anyone who may answer.

Upvotes: 0

Views: 272

Answers (1)

Laplace Ricky

Reputation: 1687

Most of the time, you should not reassign self.block = ... again after __init__, because the reassignment can make self.block point to a different, undesirable object. In your case, the result of the sliced assignment is a plain Tensor, so after the first loop iteration self.block no longer refers to the Variable, and a Tensor has no assign attribute, which is exactly the error message you see.

For example:

import tensorflow as tf
from tensorflow import keras

class my_layer(keras.layers.Layer):
  def __init__(self, shape):
    super().__init__()
    self.block = tf.Variable(tf.ones(shape))  # use tf.ones instead of tf.constant

  def call(self, inputs):
    # no self.block = ... reassignment; scatter_nd_update updates the Variable in place
    self.block.scatter_nd_update(inputs, tf.ones(tf.shape(inputs)[0]))

class my_layer2(keras.layers.Layer):
  def __init__(self, shape):
    super().__init__()
    self.var = tf.Variable(tf.ones(shape))

  def call(self, inputs):
    # sliced assign on the Variable itself, without rebinding self.var
    for i in tf.range(tf.shape(inputs)[0]):
      self.var[inputs[i, 0], inputs[i, 1]].assign(1)
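
A quick sanity check along these lines (the 3x3 shape and the index pairs below are made-up values, purely for illustration) runs both layers eagerly without the "'Tensor' object has no attribute 'assign'" error and shows that the variables stay Variables:

indices = tf.constant([[0, 1], [2, 2]])

layer_a = my_layer((3, 3))
layer_a(indices)            # writes 1 at block[0, 1] and block[2, 2] in place
print(type(layer_a.block))  # still a tf.Variable, not a Tensor

layer_b = my_layer2((3, 3))
layer_b(indices)
print(type(layer_b.var))    # still a tf.Variable
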

Update: Both solutions are working

Upvotes: 1
