Reputation: 601
I have the following model, where keep_features is around 900 and y is a one-hot encoding of the classes. I am looking for the architecture below, though. Is that possible with Keras, and what would the notation look like, especially the parallel part and the concatenation?
from keras.models import Sequential
from keras.layers import Dense, BatchNormalization
from keras import losses

model = Sequential()
model.add(Dense(keep_features, activation='relu'))
model.add(BatchNormalization())
model.add(Dense(256, activation='relu'))
model.add(BatchNormalization())
model.add(Dense(64, activation='relu'))
model.add(BatchNormalization())
model.add(Dense(3, activation='softmax'))
model.compile(loss=losses.categorical_crossentropy, optimizer='adam', metrics=['mae', 'acc'])
Upvotes: 1
Views: 186
Reputation: 1644
Using the "Multi-input and multi-output models" section of the Keras functional API guide here, you can build something like your desired model:
import tensorflow as tf

K = tf.keras

keep_features_shape = (keep_features,)  # shape tuple of one feature vector, e.g. (900,)

input1 = K.layers.Input(keep_features_shape)

# three parallel branches, all fed from the same input
denseA1 = K.layers.Dense(256, activation='relu')(input1)
denseB1 = K.layers.Dense(256, activation='relu')(input1)
denseC1 = K.layers.Dense(256, activation='relu')(input1)
batchA1 = K.layers.BatchNormalization()(denseA1)
batchB1 = K.layers.BatchNormalization()(denseB1)
batchC1 = K.layers.BatchNormalization()(denseC1)
denseA2 = K.layers.Dense(64, activation='relu')(batchA1)
denseB2 = K.layers.Dense(64, activation='relu')(batchB1)
denseC2 = K.layers.Dense(64, activation='relu')(batchC1)
batchA2 = K.layers.BatchNormalization()(denseA2)
batchB2 = K.layers.BatchNormalization()(denseB2)
batchC2 = K.layers.BatchNormalization()(denseC2)
denseA3 = K.layers.Dense(32, activation='softmax')(batchA2) # individual output layer of branch A
denseB3 = K.layers.Dense(16, activation='softmax')(batchB2) # individual output layer of branch B
denseC3 = K.layers.Dense(8, activation='softmax')(batchC2)  # individual output layer of branch C

# merge the three branch outputs into one tensor of 32 + 16 + 8 = 56 units
concat1 = K.layers.Concatenate(axis=-1)([denseA3, denseB3, denseC3])
model = K.Model(inputs=[input1], outputs=[concat1])
model.compile(loss=K.losses.categorical_crossentropy, optimizer='adam', metrics=['mae', 'acc'])
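Note that the concatenated output has 32 + 16 + 8 = 56 units, while your original Sequential model ends in a 3-class softmax. If you still want a single 3-class prediction, here is one possible variant (a sketch, assuming you want the branches to act as parallel feature extractors; it reuses the batchA2/batchB2/batchC2 tensors defined above):

denseA3 = K.layers.Dense(32, activation='relu')(batchA2)  # relu instead of softmax on each branch
denseB3 = K.layers.Dense(16, activation='relu')(batchB2)
denseC3 = K.layers.Dense(8, activation='relu')(batchC2)
concat1 = K.layers.Concatenate(axis=-1)([denseA3, denseB3, denseC3])
output1 = K.layers.Dense(3, activation='softmax')(concat1)  # single 3-class head, as in the Sequential model

model = K.Model(inputs=[input1], outputs=[output1])
model.compile(loss=K.losses.categorical_crossentropy, optimizer='adam', metrics=['mae', 'acc'])

Calling model.summary() will then show the three branches merging into a (None, 56) tensor before the final (None, 3) softmax output.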
Upvotes: 1