hh tt

Reputation: 405

Concatenate multiple CNN models in keras

I have 8 CNN models (model1, model2, model3, model4, model5, model6, model7, model8), each with Conv2D, Activation, MaxPooling, and Dropout layers. I want to concatenate their outputs, flatten the result, and finally compile and fit it so it can be used for classification, as in the figure below:

[figure: the eight CNN branches merged into a single classifier]

I'm confused about concatenation, merging, and fitting. Can I flatten each model on its own, for example with model1.add(Flatten()), and then concatenate them, or must I concatenate them all first and then flatten? My Python code is below:

merge = Concatenate([model1, model2, model3, model4, model5, model6, model7, model8])
concat_model = Sequential()
concat_model.add(merge)
concat_model.add(Flatten())
concat_model.add(Dense(128))
concat_model.add(Activation("relu"))
concat_model.add(BatchNormalization())
concat_model.add(Dropout(0.5))

concat_model.add(Dense(classes))
concat_model.add(Activation("softmax"))

concat_model.compile(loss="categorical_crossentropy", optimizer=opt, metrics=["accuracy"])

concat_model.fit_generator(aug.flow(trainX, trainY, batch_size=BS),
                           validation_data=(testX, testY),
                           steps_per_epoch=len(trainX) // BS,
                           epochs=EPOCHS, verbose=1)

When I run the program, I get the following error:

RuntimeError: You must compile your model before using it.

What is the problem? How can I concatenate, compile, and train the models? Any help or information would be appreciated.

Upvotes: 3

Views: 10517

Answers (2)

gregory l

Reputation: 41

With Keras in TensorFlow 2 you can't simply generalize the two-model example: you cannot concatenate three models without creating an intermediate model. With three models model_1, model_2, model_3 you do this:

import tensorflow as tf
from tensorflow.keras.optimizers import RMSprop

# concatenate two models at a time, not three
concat_a = tf.keras.layers.concatenate([model_1.output,
                                        model_2.output])
model_a = tf.keras.Model([model_1.input, model_2.input], concat_a)

concat = tf.keras.layers.concatenate([model_a.output,
                                      model_3.output])

dense = tf.keras.layers.Dense(1024)(concat)
relu = tf.keras.layers.LeakyReLU(alpha=0.3)(dense)
normalize = tf.keras.layers.BatchNormalization()(relu)
out = tf.keras.layers.Dense(10, activation='softmax', name='output_layer')(normalize)

# nested list of inputs when building the model
model = tf.keras.Model([[model_1.input, model_2.input], model_3.input], out)

model.summary()

optimizer = RMSprop()

model.compile(loss='categorical_crossentropy',
              optimizer=optimizer,
              metrics=['accuracy'])

# simple (flat) list of arrays when fitting
history = model.fit([trainX, trainX, trainX], trainY)
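
The same pairwise pattern can be chained for the eight models in the question, wrapping an intermediate model at each step. This is only a rough sketch, assuming model_1 through model_8 are already-built single-input models whose outputs are feature maps with matching shapes, and that classes holds the number of output classes as in the question:

import tensorflow as tf

# the eight already-built single-input branch models (assumed to exist)
branch_models = [model_1, model_2, model_3, model_4,
                 model_5, model_6, model_7, model_8]

merged = branch_models[0]
inputs = [branch_models[0].input]
for m in branch_models[1:]:
    # concatenate two outputs at a time and wrap the result in an intermediate model
    concat = tf.keras.layers.concatenate([merged.output, m.output])
    inputs.append(m.input)
    merged = tf.keras.Model(inputs, concat)

# flatten once after merging, then add the classification head
flat = tf.keras.layers.Flatten()(merged.output)
dense = tf.keras.layers.Dense(128, activation='relu')(flat)
out = tf.keras.layers.Dense(classes, activation='softmax')(dense)

full_model = tf.keras.Model(inputs, out)
full_model.compile(loss='categorical_crossentropy', optimizer='rmsprop',
                   metrics=['accuracy'])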

Upvotes: 3

Manoj Mohan

Reputation: 6044

From the documentation, "The Keras functional API is the way to go for defining complex models, such as multi-output models, directed acyclic graphs, or models with shared layers." So, using the functional API is better.

https://keras.io/getting-started/functional-api-guide/#shared-vision-model

https://keras.io/getting-started/functional-api-guide/#visual-question-answering-model

You can Flatten in each individual model and then concatenate them, as shown in the examples above. In your case, you would end up with something like this:

final_model = Model([input_1, input_2,...input_8], face_probability)

A minimal example with two branches:

from keras.layers import Conv2D, MaxPooling2D, Input, Dense, Flatten, concatenate
from keras.models import Model
import numpy as np

# first branch
digit_a = Input(shape=(27, 27, 1))
x = Conv2D(64, (3, 3))(digit_a)
x = Conv2D(64, (3, 3))(x)
x = MaxPooling2D((2, 2))(x)
out_a = Flatten()(x)

# second branch
digit_b = Input(shape=(27, 27, 1))
x = Conv2D(64, (3, 3))(digit_b)
x = Conv2D(64, (3, 3))(x)
x = MaxPooling2D((2, 2))(x)
out_b = Flatten()(x)

# merge the flattened branch outputs and add the classifier head
concatenated = concatenate([out_a, out_b])
out = Dense(1, activation='sigmoid')(concatenated)
model = Model([digit_a, digit_b], out)
print(model.summary())
model.compile('sgd', 'binary_crossentropy', ['accuracy'])

# dummy data: one array per input
X = [np.zeros((1, 27, 27, 1))] * 2
y = np.ones((1, 1))
model.fit(X, y)
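
To cover the eight branches in the question, the same pattern can be written with a loop, flattening each branch on its own before the concatenation. This is just a sketch with made-up layer sizes, 27x27 single-channel inputs, and a placeholder number of classes; adjust the shapes to your data:

from keras.layers import Conv2D, MaxPooling2D, Input, Dense, Dropout, Flatten, concatenate
from keras.models import Model

classes = 10  # placeholder for the number of output classes

inputs, branch_outputs = [], []
for i in range(8):
    inp = Input(shape=(27, 27, 1))                 # one input per branch
    x = Conv2D(32, (3, 3), activation='relu')(inp)
    x = MaxPooling2D((2, 2))(x)
    x = Dropout(0.25)(x)
    inputs.append(inp)
    branch_outputs.append(Flatten()(x))            # flatten each branch separately

merged = concatenate(branch_outputs)               # merge the eight flattened vectors
x = Dense(128, activation='relu')(merged)
x = Dropout(0.5)(x)
out = Dense(classes, activation='softmax')(x)

final_model = Model(inputs, out)
final_model.compile('sgd', 'categorical_crossentropy', ['accuracy'])

# training then takes a list of eight arrays, one per input, e.g.
# final_model.fit([trainX] * 8, trainY)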

Upvotes: 6
