Amirfel

Reputation: 77

Keras Sequential model with cRelu activation

I have a problem creating a Dense model with 3 layers whose activation function is cRelu. cRelu concatenates two relus (a negative and a positive one) and produces an output tensor twice the size of its input. When I try to add another layer after it, I always get a size mismatch error:

model = Sequential()
model.add(Dense(N, input_dim=K, activation=crelu))
model.add(Dense(N//2, activation=crelu))

How do I tell the next layer to expect an input of size 2N instead of N?

Upvotes: 1

Views: 346

Answers (1)

josoler

Reputation: 1423

Keras doesn't expect the activation function to change the output shape. If you want to change it, you should wrap the crelu functionality in a custom layer and specify the corresponding output shape:

import tensorflow as tf
from keras.layers import Layer

class cRelu(Layer):

    def __init__(self, **kwargs):
        super(cRelu, self).__init__(**kwargs)

    def build(self, input_shape):
        super(cRelu, self).build(input_shape)

    def call(self, x):
        # Concatenates relu(x) and relu(-x) along the last axis
        return tf.nn.crelu(x)

    def compute_output_shape(self, input_shape):
        """
        All axes of the output shape, except the last one,
        coincide with the input shape.
        The last one is twice the size of the corresponding input
        axis, as it's the axis along which the two relus get concatenated.
        """
        return (*input_shape[:-1], input_shape[-1] * 2)

Then you can use it as follows:

model = Sequential()
model.add(Dense(N, input_dim=K))
model.add(cRelu())
model.add(Dense(N//2))
model.add(cRelu())
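To see why the custom layer's `compute_output_shape` doubles the last axis, here is a minimal NumPy sketch of what crelu computes (the function name `crelu_np` is my own for illustration; `tf.nn.crelu` does the equivalent on tensors):

```python
import numpy as np

def crelu_np(x, axis=-1):
    # crelu concatenates relu(x) and relu(-x) along the given axis,
    # so the output is twice the input size on that axis.
    return np.concatenate([np.maximum(x, 0.0), np.maximum(-x, 0.0)], axis=axis)

x = np.array([[1.0, -2.0, 3.0]])
y = crelu_np(x)
print(y.shape)  # (1, 6) -- last axis doubled from 3 to 6
print(y)        # [[1. 0. 3. 0. 2. 0.]]
```

This is exactly the mismatch in the original model: a `Dense(N)` layer followed by crelu emits 2N values, so the next `Dense` layer sees an input of size 2N.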

Upvotes: 3
