GoTN

Reputation: 135

Keras Custom Layer to Change Output Labels

I want to write a custom layer in Keras which modifies both the input and the corresponding output labels of the network. Is it possible to modify the output labels?

I already wrote the custom layer which changes the input. Now I need to modify the output labels of the network accordingly, but I have no idea how to achieve this.

Here is an example of what I have/want:

from keras import layers
import numpy as np

class MyLayer(layers.Layer):
    def call(self, x):
        a = np.random.randint(0, 5)
        new_x = x + a
        new_y = y + a # <-- how can I do this? `y` is not available inside the layer
        return new_x

Is it even possible to modify the output labels during training?

The expected result would be the following:

X = [[1, 2, 3], [4, 5, 6]]   # original input
Y = [1, 2]                   # original labels
a = 2                        # random shift
X = [[3, 4, 5], [6, 7, 8]]   # shifted input
Y = [3, 4]                   # shifted labels

Upvotes: 2

Views: 979

Answers (1)

Stewart_R

Reputation: 14485

The short answer is probably "No, not easily - and you probably don't want to do that anyway".

The longer answer:

Consider what happens during training. With huge oversimplification, we can say we do something vaguely like:

y_pred = model(x_train)
loss = compute_loss(y_pred, y_train)
back_propagate_and_update_gradients_through_model(loss)

then, once we have trained our model, at inference we do something like:

predictions = model(x_new)

Remember, at inference we don't have access to any labels! It therefore doesn't really make much sense for the model to wrangle any labels.
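To make that concrete, here is a minimal sketch of a custom training step, assuming TensorFlow 2.x. The tiny Dense model, MSE loss and SGD optimizer are just placeholders of my choosing, and the data is the example from the question. Notice the labels only enter through the loss computation, outside the model, which is why a layer never gets to see (or rewrite) them:

import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
loss_fn = tf.keras.losses.MeanSquaredError()
optimizer = tf.keras.optimizers.SGD()

dataset = tf.data.Dataset.from_tensor_slices(
    (tf.constant([[1., 2., 3.], [4., 5., 6.]]),   # X
     tf.constant([[1.], [2.]]))                   # Y
).batch(2)

for x_batch, y_batch in dataset:
    with tf.GradientTape() as tape:
        y_pred = model(x_batch, training=True)   # the model only ever sees x
        loss = loss_fn(y_batch, y_pred)          # labels are used here, outside the model
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))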

It would be much better (and quite commonplace) to do any label wrangling either in the pre-processing pipeline or, at a pinch, in a custom loss function:

def my_loss(y_true, y_pred):
    y_pred_wrangled = ...  # do your wrangling here
    return tf.keras.losses.{your_preferred_loss_fn}(y_true, y_pred_wrangled)
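And here is a sketch of the pre-processing option, assuming a tf.data pipeline (the pipeline and the shift_both helper are my own illustration, not anything from the question's code). The random shift mirrors the np.random.randint(0, 5) idea from the question and is applied to the features and labels together, so they stay consistent:

import tensorflow as tf

def shift_both(x, y):
    # one random integer offset per example, added to both x and y
    a = tf.cast(tf.random.uniform([], minval=0, maxval=5, dtype=tf.int32), tf.float32)
    return x + a, y + a

dataset = tf.data.Dataset.from_tensor_slices(
    (tf.constant([[1., 2., 3.], [4., 5., 6.]]),   # X
     tf.constant([1., 2.]))                       # Y
).map(shift_both).batch(2)

# model.fit(dataset) then trains on the shifted (x, y) pairs.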

Upvotes: 1
