Akiiino

Reputation: 1100

How can I reuse a "composite" Keras layer?

So, I have this small helper function:

def ResConv(input, size):
    return BatchNormalization()(Add()([
        GLU()(Conv1D(size*2, 5, padding='causal')(input)),
        input
    ]))

It creates a specific sequence of layers to be used together; it's pretty clear.

However, I now realize that I need to reuse the same layers on different inputs; that is, I need something like this:

my_res_conv = ResConv(100)
layer_a = my_res_conv(input_a)
layer_b = my_res_conv(input_b)
concat = concatenate([layer_a, layer_b])

and have layer_a and layer_b share weights. As written, each call to ResConv creates brand-new layers, so nothing would be shared.

How can I do this? Do I have to write a custom layer? I've never done that before, and I'm not sure how to approach this situation.

Upvotes: 1

Views: 553

Answers (1)

Akiiino

Reputation: 1100

I ended up actually making a custom class like this:

from tensorflow.keras.layers import Add, BatchNormalization, Conv1D
# GLU is assumed to be a custom gated-linear-unit layer; it is not a Keras built-in.

class ResConv():
    def __init__(self, size):
        # The weight-bearing layers are created once here, so every call shares them.
        self.conv = Conv1D(size*2, 5, padding='causal')
        self.batchnorm = BatchNormalization()

    def __call__(self, inputs):
        # Add() carries no weights, so creating a fresh one per call is harmless.
        return self.batchnorm(Add()([
            GLU()(self.conv(inputs)),
            inputs
        ]))

Basically, you initialize the weight-bearing layers once in __init__ and write the whole computation sequence in __call__; this way the class reapplies the same layer instances to every new input, so all calls share weights.
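To see the sharing in action, here's a minimal sketch building on the ResConv class above, assuming TensorFlow 2.x Keras. Since GLU is not a built-in Keras layer, a bare-bones gated-linear-unit stand-in is included; swap in your own implementation if it differs:

import tensorflow as tf
from tensorflow.keras.layers import Conv1D, Input, Layer, concatenate
from tensorflow.keras.models import Model

class GLU(Layer):
    # Minimal stand-in: split the channels in half, gate one half with the sigmoid of the other.
    def call(self, x):
        a, b = tf.split(x, 2, axis=-1)
        return a * tf.sigmoid(b)

input_a = Input(shape=(None, 100))
input_b = Input(shape=(None, 100))

my_res_conv = ResConv(100)
layer_a = my_res_conv(input_a)  # first call creates the Conv1D/BatchNorm weights
layer_b = my_res_conv(input_b)  # second call reuses those same weights
concat = concatenate([layer_a, layer_b])

model = Model([input_a, input_b], concat)

# The shared Conv1D shows up exactly once among the model's layers:
assert len([l for l in model.layers if isinstance(l, Conv1D)]) == 1

The first call builds the Conv1D and BatchNormalization weights; every later call routes new tensors through those same instances, which is exactly the weight sharing asked about.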

Upvotes: 2
