John Brandt

Reputation: 71

Keras share weights between custom layers

I am working with the keras-capsnet implementation of Capsule Networks, and am trying to apply the same layer to 30 images per sample.

The weights are initialized within the __init__ and build methods of the class, shown below. I have successfully shared weights between the primary routing layers, which just use tf.layers.conv2d: I can give both calls the same name and set reuse=True.
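For context, the tf.layers.conv2d sharing I mean works like this (a minimal TF 1.x sketch; the shapes and filter settings are placeholders):

    import tensorflow as tf  # TF 1.x API

    img_a = tf.placeholder(tf.float32, [None, 28, 28, 1])  # placeholder shapes
    img_b = tf.placeholder(tf.float32, [None, 28, 28, 1])

    # Same name plus reuse=True on the second call -> both ops use one kernel.
    conv_a = tf.layers.conv2d(img_a, filters=256, kernel_size=9,
                              name='primary_conv')
    conv_b = tf.layers.conv2d(img_b, filters=256, kernel_size=9,
                              name='primary_conv', reuse=True)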

Does anyone know how to initialize weights in a Keras custom layer so that they can be reused? I am much more familiar with the TensorFlow API than with the Keras one!

from keras import initializers, layers

class CapsuleLayer(layers.Layer):
    def __init__(self, num_capsule, dim_capsule, routings=3,
                 kernel_initializer='glorot_uniform',
                 **kwargs):
        super(CapsuleLayer, self).__init__(**kwargs)
        self.num_capsule = num_capsule
        self.dim_capsule = dim_capsule
        self.routings = routings
        self.kernel_initializer = initializers.get(kernel_initializer)

    def build(self, input_shape):
        assert len(input_shape) >= 3, "The input Tensor should have shape=[None, input_num_capsule, input_dim_capsule]"
        self.input_num_capsule = input_shape[1]
        self.input_dim_capsule = input_shape[2]

        # Weights are created here; every new layer instance gets its own W
        self.W = self.add_weight(shape=[self.num_capsule, self.input_num_capsule,
                                        self.dim_capsule, self.input_dim_capsule],
                                 initializer=self.kernel_initializer,
                                 name='W')
        self.built = True

Upvotes: 2

Views: 1135

Answers (1)

John Brandt

Reputation: 71

The answer was simple: instantiate the layer once, without calling it on any input, and then call that single built layer on each input separately. Because the weights belong to the layer instance, every call reuses the same weights.
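A minimal sketch of that pattern (the input shape and capsule parameters below are placeholders):

    from keras import layers, models

    # One layer instance -> one set of weights (W is created once in build()).
    shared_caps = CapsuleLayer(num_capsule=10, dim_capsule=16, routings=3)

    # Hypothetical setup: 30 images per sample, each yielding 1152 capsules of dim 8.
    inputs = [layers.Input(shape=(1152, 8)) for _ in range(30)]

    # Calling the SAME instance on every input shares W across all 30 branches.
    outputs = [shared_caps(x) for x in inputs]

    model = models.Model(inputs=inputs, outputs=outputs)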

Upvotes: 1
