Yalikesi

Reputation: 103

Setting specific entries to some value in Keras

I have the following Keras model code:

import tensorflow as tf
from tensorflow.keras import layers, Model

def create_model(filters=12):
    inputs = layers.Input((32, 32, 20))

    x = layers.Conv2D(filters, 3, padding='same')(inputs)
    x = layers.BatchNormalization()(x)
    x = layers.ReLU()(x)

    outputs = layers.Conv2D(6, 1, padding='same')(x)
    outputs = outputs * inputs[..., :1]

    model = Model(inputs, outputs)
    return model

I want to set some entries of outputs to new values based on inputs, using the following code:

outputs[..., 0] = tf.ones_like(inputs[..., 0]) - inputs[..., 0]

However, it throws an error: TypeError: 'KerasTensor' object does not support item assignment. I've also tried using

outputs = outputs[..., 0].assign(tf.ones_like(inputs[..., 0]) - inputs[..., 0])

but it throws a different error: 'KerasTensor' object has no attribute 'assign' (this pattern does work outside the Keras functional API, e.g. on a tf.Variable). So, is there a way to set some entries of outputs the way I want?
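For context, the sliced .assign pattern works eagerly on a tf.Variable; here is a minimal sketch outside the model, with made-up shapes:

import tensorflow as tf

# Sliced assignment is supported on variables, but not on KerasTensors.
v = tf.Variable(tf.random.normal((2, 3, 4)))
mask = tf.cast(tf.random.uniform((2, 3, 4)) > 0.5, tf.float32)

v[..., 0].assign(tf.ones_like(mask[..., 0]) - mask[..., 0])
print(v[..., 0])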

Example of what I want to do (using arrays):

inputs:
[[[0 1 0 0]
  [1 1 1 0]
  [1 0 0 0]]

 [[1 1 0 1]
  [0 1 0 1]
  [1 1 1 0]]]

outputs:
[[[ 0.538 -1.141 -0.483  0.2  ]
  [-0.418  0.087 -0.915  0.433]
  [ 0.434  1.298  1.202  1.13 ]]

 [[ 0.175  1.672  0.769  0.226]
  [ 1.203  0.019  0.107  0.09 ]
  [-0.108  0.145 -0.537  0.213]]]

After outputs = outputs * inputs[..., :1] I get

[[[ 0.    -0.    -0.     0.   ]
  [-0.418  0.087 -0.915  0.433]
  [ 0.434  1.298  1.202  1.13 ]]

 [[ 0.175  1.672  0.769  0.226]
  [ 0.     0.     0.     0.   ]
  [-0.108  0.145 -0.537  0.213]]]

And with outputs[..., 0] = tf.ones_like(inputs[..., 0]) - inputs[..., 0] I want to get

[[[ 1.    -0.    -0.     0.   ]
  [ 0.     0.087 -0.915  0.433]
  [ 0.     1.298  1.202  1.13 ]]

 [[ 0.     1.672  0.769  0.226]
  [ 1.     0.     0.     0.   ]
  [ 0.     0.145 -0.537  0.213]]]
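For reference, the same operation is easy to write with plain NumPy arrays, where item assignment is allowed; a minimal sketch of the example above:

import numpy as np

# The binary array playing the role of inputs in the example above.
inputs = np.array([[[0, 1, 0, 0],
                    [1, 1, 1, 0],
                    [1, 0, 0, 0]],
                   [[1, 1, 0, 1],
                    [0, 1, 0, 1],
                    [1, 1, 1, 0]]], dtype=np.float32)

# The values playing the role of outputs.
outputs = np.array([[[ 0.538, -1.141, -0.483,  0.2  ],
                     [-0.418,  0.087, -0.915,  0.433],
                     [ 0.434,  1.298,  1.202,  1.13 ]],
                    [[ 0.175,  1.672,  0.769,  0.226],
                     [ 1.203,  0.019,  0.107,  0.09 ],
                     [-0.108,  0.145, -0.537,  0.213]]], dtype=np.float32)

outputs = outputs * inputs[..., :1]      # mask every channel by the first input channel
outputs[..., 0] = 1.0 - inputs[..., 0]   # item assignment works on NumPy arrays
print(outputs)                           # matches the desired result above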

Upvotes: 0

Views: 79

Answers (1)

AloneTogether

Reputation: 26708

Here is a simple working solution based on tensor_scatter_nd_update and meshgrid. For more information, check out this post. I also introduced a Lambda layer to your model to compute the outputs.

import tensorflow as tf

def compute_output(tensor):
    outputs, inputs = tensor
    # Mask every channel by the first input channel, as in the question.
    outputs = outputs * inputs[..., :1]

    # Build an index (batch, row, col, 0) for the first channel of every pixel.
    index_1, index_2, index_3 = tf.meshgrid(tf.range(tf.shape(outputs)[0]),
                                            tf.range(tf.shape(outputs)[1]),
                                            tf.range(tf.shape(outputs)[2]),
                                            indexing='ij')
    index_4 = 0 * tf.cast(tf.ones_like(outputs[..., 0]), dtype=tf.int32)
    indices = tf.stack([index_1, index_2, index_3, index_4], axis=-1)
    # Overwrite channel 0 with 1 - inputs[..., 0] at those indices.
    return tf.tensor_scatter_nd_update(outputs, indices, tf.ones_like(inputs[..., 0]) - inputs[..., 0])

def create_model():
    inputs = tf.keras.layers.Input((32, 32, 20))

    x = tf.keras.layers.Conv2D(12, 3, padding='same')(inputs)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.layers.ReLU()(x)
    
    outputs = tf.keras.layers.Conv2D(6, 1, padding='same')(x)
    outputs = tf.keras.layers.Lambda(compute_output)((outputs, inputs)) 

    model = tf.keras.Model(inputs, outputs)
    return model

dummy_data = tf.random.normal((1, 32, 32, 20))
model = create_model()
print(model(dummy_data))
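As a quick eager-mode check (not part of the model above; it assumes the example arrays from the question with a leading batch dimension added), compute_output reproduces the desired result:

inputs_example = tf.expand_dims(tf.constant([[[0, 1, 0, 0],
                                              [1, 1, 1, 0],
                                              [1, 0, 0, 0]],
                                             [[1, 1, 0, 1],
                                              [0, 1, 0, 1],
                                              [1, 1, 1, 0]]], dtype=tf.float32), axis=0)
outputs_example = tf.expand_dims(tf.constant([[[ 0.538, -1.141, -0.483,  0.2  ],
                                               [-0.418,  0.087, -0.915,  0.433],
                                               [ 0.434,  1.298,  1.202,  1.13 ]],
                                              [[ 0.175,  1.672,  0.769,  0.226],
                                               [ 1.203,  0.019,  0.107,  0.09 ],
                                               [-0.108,  0.145, -0.537,  0.213]]]), axis=0)

# Channel 0 becomes 1 - inputs[..., 0]; the other channels are masked by inputs[..., :1].
print(compute_output((outputs_example, inputs_example)))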

Upvotes: 1
