Dims

Reputation: 51199

How to share convolution kernels between layers in Keras?

Suppose I want to compare two images with a deep convolutional NN. How can I implement two different pathways with the same kernels in Keras?

Like this:

[Figure: two image inputs passing through convolutional layers 1, 2 and 3 with shared kernels, then merging]

I need convolutional layers 1, 2 and 3 to use and train the same kernels.

Is it possible?

I was also thinking of concatenating the images like below,

[Figure: the two images concatenated into a single input]

but the question is about how to implement the topology in the first picture.

Upvotes: 4

Views: 2899

Answers (2)

Daniel Möller

Reputation: 86620

You can use the same layer instance twice in the model; each call creates a new node but reuses the same weights:

from keras.models import Model
from keras.layers import *

#create the shared layers
layer1 = Conv2D(filters, kernel_size, ...)
layer2 = Conv2D(...)
layer3 = Conv2D(...)

#create one input tensor for each side
input1 = Input((imageX, imageY, channels))
input2 = Input((imageX, imageY, channels))   

#use the layers in side 1
out1 = layer1(input1)   
out1 = layer2(out1)   
out1 = layer3(out1)

#use the layers in side 2
out2 = layer1(input2)   
out2 = layer2(out2)   
out2 = layer3(out2)

#concatenate and add the fully connected layers
out = Concatenate()([out1,out2])
out = Flatten()(out)
out = Dense(...)(out)   
out = Dense(...)(out)   

#create the model taking 2 inputs with one output
model = Model([input1,input2],out)
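
To make that skeleton concrete, here is a runnable version; the filter counts, kernel sizes, image dimensions and output head below are illustrative assumptions, not values from the question:

from keras.models import Model
from keras.layers import Input, Conv2D, Flatten, Dense, Concatenate

#assumed input size: 64x64 RGB images
imageX, imageY, channels = 64, 64, 3

#one instance of each layer, so both pathways train the same kernels
layer1 = Conv2D(32, (3, 3), activation='relu')
layer2 = Conv2D(32, (3, 3), activation='relu')
layer3 = Conv2D(64, (3, 3), activation='relu')

input1 = Input((imageX, imageY, channels))
input2 = Input((imageX, imageY, channels))

#side 1 and side 2 call the very same layer objects
out1 = layer3(layer2(layer1(input1)))
out2 = layer3(layer2(layer1(input2)))

out = Concatenate()([out1, out2])
out = Flatten()(out)
out = Dense(128, activation='relu')(out)
out = Dense(1, activation='sigmoid')(out)  #e.g. a "same image?" score

model = Model([input1, input2], out)
model.compile(optimizer='adam', loss='binary_crossentropy')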

You could also use the same model twice, making it a submodel of a bigger one:

#have a previously prepared model 
convModel = ...  #some model previously prepared

#define two different inputs
input1 = Input((imageX, imageY, channels))
input2 = Input((imageX, imageY, channels))   

#use the model to get two different outputs:
out1 = convModel(input1)
out2 = convModel(input2)

#concatenate the outputs and add the final part of your model: 
out = Concatenate()([out1,out2])
out = Flatten()(out)
out = Dense(...)(out)   
out = Dense(...)(out)   

#create the model taking 2 inputs with one output
model = Model([input1,input2],out)
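
A minimal sketch of what convModel itself could look like (the architecture inside it is an assumption for illustration; any Model with a single image input works the same way):

from keras.models import Model
from keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense, Concatenate

imageX, imageY, channels = 64, 64, 3  #assumed input size

#the shared convolutional submodel
convInput = Input((imageX, imageY, channels))
x = Conv2D(32, (3, 3), activation='relu')(convInput)
x = MaxPooling2D((2, 2))(x)
x = Conv2D(64, (3, 3), activation='relu')(x)
convModel = Model(convInput, x)

input1 = Input((imageX, imageY, channels))
input2 = Input((imageX, imageY, channels))

#calling the submodel twice reuses all of its weights
out1 = convModel(input1)
out2 = convModel(input2)

out = Concatenate()([out1, out2])
out = Flatten()(out)
out = Dense(1, activation='sigmoid')(out)

model = Model([input1, input2], out)

Because convModel's weights exist once and are merely applied twice, gradients from both branches accumulate into the same kernels during training.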

Upvotes: 8

David

Reputation: 116

Indeed, using the same instance of a layer twice ensures that the weights will be shared.

Just look at the Keras siamese example; here is an excerpt from that model:

# because we re-use the same instance `base_network`,
# the weights of the network
# will be shared across the two branches
processed_a = base_network(input_a)
processed_b = base_network(input_b)
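
If you want to convince yourself that the weights really are shared, here is a self-contained toy check (the layer sizes are arbitrary stand-ins, not taken from the siamese example):

from keras.models import Model
from keras.layers import Input, Dense

#toy shared branch standing in for base_network
inp = Input((8,))
base_network = Model(inp, Dense(4)(inp))

input_a = Input((8,))
input_b = Input((8,))
processed_a = base_network(input_a)
processed_b = base_network(input_b)

model = Model([input_a, input_b], [processed_a, processed_b])

#the Dense layer's kernel and bias are counted once, not twice
print(len(model.trainable_weights))  #prints 2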

Upvotes: 3
