Clever

Reputation: 107

How to share weights between two Keras layers?

How can I share weights between two Keras layers, e.g. between out1 and out2 below?

import tensorflow as tf

inp1 = tf.keras.Input(shape=(100, 200, 3))
inp2 = tf.keras.Input(shape=(400, 800, 3))
out1 = tf.keras.layers.Conv2D(32, 3, strides=(2,2), padding='same', activation='relu', name='1')(inp1)
out2 = tf.keras.layers.Conv2D(32, 3, strides=(2,2), padding='same', activation='relu', name='2')(inp2)

Upvotes: 0

Views: 2754

Answers (1)

today

Reputation: 33410

If you want to apply the same convolutional layer to the inp1 and inp2 tensors, you just need to create the layer first and then call it on both inp1 and inp2:

# Create the layer once; calling it on both inputs reuses the same kernel and bias.
shared_conv = tf.keras.layers.Conv2D(32, 3, strides=(2,2), padding='same', activation='relu')
out1 = shared_conv(inp1)
out2 = shared_conv(inp2)

See the shared layers section in the Keras documentation for more information.
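As a quick sanity check (a minimal sketch reusing the input shapes from the question), you can build a two-input model and confirm that both outputs are backed by the same kernel and bias:

import tensorflow as tf

inp1 = tf.keras.Input(shape=(100, 200, 3))
inp2 = tf.keras.Input(shape=(400, 800, 3))

# One layer instance, called twice: both branches share the same weights.
shared_conv = tf.keras.layers.Conv2D(32, 3, strides=(2,2), padding='same', activation='relu')
out1 = shared_conv(inp1)
out2 = shared_conv(inp2)

model = tf.keras.Model(inputs=[inp1, inp2], outputs=[out1, out2])

# The shared layer contributes only one kernel and one bias to the model.
print(len(shared_conv.weights))      # 2 (kernel + bias)
print(len(model.trainable_weights))  # 2, not 4

Any gradient update to shared_conv during training therefore affects both branches at once.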

Upvotes: 4
