Reputation: 829
I would like to implement Hierarchical Multiscale LSTM as a Keras layer.
It was published here and implemented in TensorFlow here.
My understanding is that there is a way to wrap such a TensorFlow object in Keras as a layer. I'm not sure how complicated it would be, but I think it's feasible. Could you explain how to do it?
Upvotes: 2
Views: 1452
Reputation: 11895
This is usually done by implementing a custom layer. More specifically, you should inherit from keras.engine.topology.Layer and provide a custom implementation of the following methods, placing the TensorFlow code inside them (a minimal skeleton follows the list):
build(input_shape): this is where you define your weights. This method must set self.built = True, which can be done by calling super([Layer], self).build().

call(x): this is where the layer's logic lives. Unless you want your layer to support masking, you only have to care about the first argument passed to call: the input tensor.

compute_output_shape(input_shape): in case your layer modifies the shape of its input, specify the shape transformation logic here. This allows Keras to do automatic shape inference.
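Here is a minimal sketch of that skeleton, assuming the Keras 2 custom-layer API. The simple linear transformation in call is only a placeholder standing in for the HM-LSTM logic you would port from the TensorFlow implementation:

```python
from keras import backend as K
from keras.engine.topology import Layer

class MyCustomLayer(Layer):
    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyCustomLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Define trainable weights for the layer.
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[1], self.output_dim),
                                      initializer='uniform',
                                      trainable=True)
        # Calling the parent build() sets self.built = True.
        super(MyCustomLayer, self).build(input_shape)

    def call(self, x):
        # Placeholder logic: replace with the ported TensorFlow code.
        return K.dot(x, self.kernel)

    def compute_output_shape(self, input_shape):
        # The layer changes the last dimension, so declare the new shape.
        return (input_shape[0], self.output_dim)
```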
Since you're trying to implement a recurrent layer, it may also be convenient to inherit directly from keras.legacy.layers.Recurrent. In that case, you probably do not need to redefine compute_output_shape(input_shape). If your layer needs additional arguments, you can pass them to the __init__ method of your custom layer, as in the sketch below.
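As an illustration of extra constructor arguments, here is a hypothetical layer with a scale hyperparameter; the get_config override is optional but lets Keras serialize the argument when saving the model:

```python
from keras.engine.topology import Layer

class ScaledLayer(Layer):  # hypothetical example layer
    def __init__(self, scale=1.0, **kwargs):
        # Extra hyperparameters are ordinary __init__ arguments.
        self.scale = scale
        super(ScaledLayer, self).__init__(**kwargs)

    def call(self, x):
        return self.scale * x

    def compute_output_shape(self, input_shape):
        # Output shape is unchanged here.
        return input_shape

    def get_config(self):
        # Include the extra argument so the layer can be serialized.
        config = {'scale': self.scale}
        base_config = super(ScaledLayer, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
```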
Upvotes: 3