patti_jane

Reputation: 3851

Keras: How to concatenate two CNNs?

I'm trying to implement the CNN model in this article (https://arxiv.org/abs/1605.07333)

Here, they have two different contexts as inputs, which are processed by two independent conv and max-pooling layers. After pooling, they concatenate the results.

[figure: the two-input CNN architecture from the paper]

Assuming each CNN is modelled as follows, how do I achieve the model above?

def baseline_cnn(activation='relu'):
    model = Sequential()
    model.add(Embedding(SAMPLE_SIZE, EMBEDDING_DIMS, input_length=MAX_SMI_LEN))
    model.add(Dropout(0.2))
    model.add(Conv1D(NUM_FILTERS, FILTER_LENGTH, padding='valid', activation=activation, strides=1))
    model.add(GlobalMaxPooling1D())
    model.add(Dense(1))
    model.add(Activation('sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

    return model

Thanks in advance!

Final Code: I simply used @FernandoOrtega's solution:

import keras
from keras.layers import Input, Conv1D, GlobalMaxPooling1D, Dense, Dropout
from keras.models import Model

def build_combined(FLAGS, NUM_FILTERS, FILTER_LENGTH1, FILTER_LENGTH2):
    # Two independent inputs, one per context
    Dinput = Input(shape=(FLAGS.max_dlen, FLAGS.dset_size))
    Tinput = Input(shape=(FLAGS.max_tlen, FLAGS.tset_size))

    # First branch: two conv layers followed by global max pooling
    encode_d = Conv1D(filters=NUM_FILTERS, kernel_size=FILTER_LENGTH1, activation='relu', padding='valid', strides=1)(Dinput)
    encode_d = Conv1D(filters=NUM_FILTERS*2, kernel_size=FILTER_LENGTH1, activation='relu', padding='valid', strides=1)(encode_d)
    encode_d = GlobalMaxPooling1D()(encode_d)

    # Second branch: same structure with its own filter length
    encode_tt = Conv1D(filters=NUM_FILTERS, kernel_size=FILTER_LENGTH2, activation='relu', padding='valid', strides=1)(Tinput)
    encode_tt = Conv1D(filters=NUM_FILTERS*2, kernel_size=FILTER_LENGTH2, activation='relu', padding='valid', strides=1)(encode_tt)
    encode_tt = GlobalMaxPooling1D()(encode_tt)

    # Concatenate the pooled representations of both branches
    encode_combined = keras.layers.concatenate([encode_d, encode_tt])

    # Fully connected layers
    FC1 = Dense(1024, activation='relu')(encode_combined)
    FC2 = Dropout(0.1)(FC1)
    FC2 = Dense(512, activation='relu')(FC2)

    predictions = Dense(1, kernel_initializer='normal')(FC2)

    combinedModel = Model(inputs=[Dinput, Tinput], outputs=[predictions])
    combinedModel.compile(optimizer='adam', loss='mean_squared_error', metrics=['accuracy'])

    print(combinedModel.summary())

    return combinedModel
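
For completeness, fitting this two-input model takes a list with one array per input. The variable names, shapes, and hyperparameter values below are placeholders of mine, not part of the original code:

    # Hypothetical usage: XD and XT are the encoded inputs for the two contexts, Y the targets
    model = build_combined(FLAGS, NUM_FILTERS=32, FILTER_LENGTH1=4, FILTER_LENGTH2=8)
    model.fit([XD, XT], Y, batch_size=256, epochs=100, validation_split=0.1)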

Upvotes: 4

Views: 12341

Answers (1)

Fernando Ortega

Reputation: 735

If you want to concatenate two sub-networks, you should use the keras.layers.concatenate function.

Furthermore, I recommend using the Functional API, since it is the easiest way to build complex networks like yours. For instance:

from keras.layers import Input, Embedding, Dropout, Conv1D, GlobalMaxPooling1D, Dense, concatenate
from keras.models import Model

def baseline_cnn(activation='relu'):

    # Defining input 1
    input1 = Input(shape=(MAX_SMI_LEN,))
    x1 = Embedding(SAMPLE_SIZE, EMBEDDING_DIMS, input_length=MAX_SMI_LEN)(input1)
    x1 = Dropout(0.2)(x1)
    x1 = Conv1D(NUM_FILTERS, FILTER_LENGTH, padding='valid', activation=activation, strides=1)(x1)
    x1 = GlobalMaxPooling1D()(x1)

    # Defining input 2
    input2 = Input(shape=(MAX_SMI_LEN,))
    x2 = Embedding(SAMPLE_SIZE, EMBEDDING_DIMS, input_length=MAX_SMI_LEN)(input2)
    x2 = Dropout(0.2)(x2)
    x2 = Conv1D(NUM_FILTERS, FILTER_LENGTH, padding='valid', activation=activation, strides=1)(x2)
    x2 = GlobalMaxPooling1D()(x2)

    # Merging subnetworks
    x = concatenate([x1, x2])

    # Final Dense layer and compilation
    output = Dense(1, activation='sigmoid')(x)
    model = Model(inputs=[input1, input2], outputs=output)
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

    return model

After compiling this model, you can fit/evaluate it with model.fit([data_split1, data_split2], y), where data_split1 and data_split2 are your two input contexts and y contains the labels.
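
As a minimal sketch of that call (the sample count and dummy data here are my own assumptions, purely for illustration of the two-input list):

    import numpy as np

    # Dummy data: 100 samples per context, each a sequence of MAX_SMI_LEN token ids
    data_split1 = np.random.randint(0, SAMPLE_SIZE, size=(100, MAX_SMI_LEN))
    data_split2 = np.random.randint(0, SAMPLE_SIZE, size=(100, MAX_SMI_LEN))
    y = np.random.randint(0, 2, size=(100, 1))

    model = baseline_cnn()
    model.fit([data_split1, data_split2], y, batch_size=32, epochs=10)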

More info about multi-input models can be found in the Keras documentation: Multi-input and multi-output models.

Upvotes: 3
