La Cordillera

Reputation: 422

How to add several binary classifiers at the end of a MLP with Keras?

Say I have an MLP that looks like:

import tensorflow as tf
from tensorflow.keras import models, layers, optimizers

model = models.Sequential()
model.add(layers.Dense(200, activation="relu", input_dim=250))
model.add(layers.Dense(100, activation="relu"))
model.add(layers.Dense(75, activation="relu"))
model.add(layers.Dense(50, activation="relu"))
model.add(layers.Dense(17, activation="softmax"))

model.compile(optimizer=optimizers.Adam(learning_rate=0.001),
              loss="categorical_crossentropy",
              metrics=['MeanSquaredError', 'AUC', 'accuracy', tf.keras.metrics.Precision()])

history = model.fit(X_train, y_train, epochs=100,
                    validation_data=(X_val, y_val))

Now I want, at the final layer, a separate binary classifier for each of the 17 classes, rather than having all 17 classes output together through the softmax; meaning that the binary classifiers should all branch from the last hidden layer. Is this possible in Keras? I am guessing it requires a different type of model, instead of Sequential()?

EDIT:

I understood that I can't use Sequential for this, and changed the model to:

from tensorflow.keras import Input
from tensorflow.keras import Model
from tensorflow.keras.layers import Dense, Dropout

def test_model(layer_in):

    dense1 = Dense(200, activation = "relu") (layer_in)
    drop1 = Dropout(rate=0.02)(dense1)
    dense2 = Dense(100, activation="relu")(drop1)
    drop2 = Dropout(rate=0.02)(dense2)
    dense3 = Dense(75, activation="relu")(drop2)
    drop3 = Dropout(rate=0.02)(dense3)
    dense4 = Dense(50, activation="relu")(drop3)
    drop4 = Dropout(rate=0.01)(dense4)
    out = Dense(17, activation= "softmax")(drop4)
    return out


layer_in = Input(shape=(250,))
layer_out = test_model(layer_in)

model = Model(inputs=layer_in, outputs=layer_out)

from tensorflow.keras.utils import plot_model
plot_model(model, show_shapes=True)


So I guess the end goal is to have 17 binary output layers at the end, each with a sigmoid activation, all connected to drop4...

Upvotes: 0

Views: 217

Answers (1)

Varun Singh

Reputation: 519

In your code you are using the Sequential API to create the model. The Sequential API has limitations: it can only build a layer-by-layer model, it can't handle multiple inputs or outputs, and it can't be used for branching.

Below is the text from the official Keras website: https://keras.io/guides/functional_api/

The functional API makes it easy to manipulate multiple inputs and outputs. This cannot be handled with the Sequential API.

This Stack Overflow link will also be useful: Keras' Sequential vs Functional API for Multi-Task Learning Neural Network

You can create such a model using the Functional API or model subclassing.

With the Functional API, your model would look like the following, assuming output_1 is classification with 17 classes, output_2 is classification with 3 classes, and output_3 is regression:

import tensorflow as tf
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense

input_layer = Input(shape=(250,))
x = Dense(200, activation="relu")(input_layer)
x = Dense(100, activation="relu")(x)
x = Dense(75, activation="relu")(x)
x = Dense(50, activation="relu")(x)
output_1 = Dense(17, activation="softmax", name='output_1')(x)
output_2 = Dense(3, activation="softmax", name='output_2')(x)
output_3 = Dense(1, name='output_3')(x)

model = Model(inputs=input_layer, outputs=[output_1, output_2, output_3])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
              loss = {'output_1': tf.keras.losses.CategoricalCrossentropy(),
                      'output_2': tf.keras.losses.CategoricalCrossentropy(),
                      'output_3': "mse"
                     },
              metrics = {'output_1': 'accuracy',
                         'output_2': 'accuracy',
                         'output_3': tf.keras.metrics.RootMeanSquaredError()
                        }
             )
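A multi-output model like this is trained by passing one target per named output. Below is a minimal, hedged sketch of how the fit call could look, using randomly generated placeholder data (the sample count and targets are assumptions purely for illustration, not real training data):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense

# Rebuild a three-head model as in the answer above
input_layer = Input(shape=(250,))
x = Dense(200, activation="relu")(input_layer)
x = Dense(50, activation="relu")(x)
output_1 = Dense(17, activation="softmax", name="output_1")(x)
output_2 = Dense(3, activation="softmax", name="output_2")(x)
output_3 = Dense(1, name="output_3")(x)
model = Model(inputs=input_layer, outputs=[output_1, output_2, output_3])
model.compile(optimizer="adam",
              loss={"output_1": "categorical_crossentropy",
                    "output_2": "categorical_crossentropy",
                    "output_3": "mse"})

# Placeholder data: 32 samples of 250 features, one target per output name
X = np.random.rand(32, 250).astype("float32")
y = {"output_1": tf.keras.utils.to_categorical(np.random.randint(17, size=32), 17),
     "output_2": tf.keras.utils.to_categorical(np.random.randint(3, size=32), 3),
     "output_3": np.random.rand(32, 1).astype("float32")}

history = model.fit(X, y, epochs=1, verbose=0)
```

Keras matches the keys of the `y` dict to the `name` given to each output layer, which is why naming the heads matters.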

Update: below is the code assuming you have 6 classes; you can extend the same pattern to 17 classes.

import tensorflow as tf
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense

input_layer = Input(shape=(250,))
x = Dense(200, activation="relu")(input_layer)
x = Dense(100, activation="relu")(x)
x = Dense(75, activation="relu")(x)
x = Dense(50, activation="relu")(x)
# Each head is a single-unit sigmoid: a softmax over one unit would
# always output 1, so binary heads must use sigmoid + binary crossentropy.
output_1 = Dense(1, activation='sigmoid', name='output_1')(x)
output_2 = Dense(1, activation='sigmoid', name='output_2')(x)
output_3 = Dense(1, activation='sigmoid', name='output_3')(x)
output_4 = Dense(1, activation='sigmoid', name='output_4')(x)
output_5 = Dense(1, activation='sigmoid', name='output_5')(x)
output_6 = Dense(1, activation='sigmoid', name='output_6')(x)

model = Model(inputs=input_layer,
              outputs=[output_1, output_2, output_3, output_4, output_5, output_6])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
              loss = {'output_1': tf.keras.losses.BinaryCrossentropy(),
                      'output_2': tf.keras.losses.BinaryCrossentropy(),
                      'output_3': tf.keras.losses.BinaryCrossentropy(),
                      'output_4': tf.keras.losses.BinaryCrossentropy(),
                      'output_5': tf.keras.losses.BinaryCrossentropy(),
                      'output_6': tf.keras.losses.BinaryCrossentropy()
                     },
              metrics = {'output_1': 'accuracy',
                         'output_2': 'accuracy',
                         'output_3': 'accuracy',
                         'output_4': 'accuracy',
                         'output_5': 'accuracy',
                         'output_6': 'accuracy'
                        }
             )
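As an aside not covered in the answer above: if all the binary heads share the same input and you don't need separate per-head losses or weights, the same effect can be had more compactly with a single multi-label output layer of 17 sigmoid units trained with binary crossentropy, where each unit acts as an independent binary classifier. A minimal sketch (layer sizes are assumptions matching the question's MLP):

```python
import numpy as np
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense

inp = Input(shape=(250,))
x = Dense(200, activation="relu")(inp)
x = Dense(50, activation="relu")(x)
# 17 independent sigmoids: each unit is a per-class binary classifier
out = Dense(17, activation="sigmoid")(x)

model = Model(inputs=inp, outputs=out)
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["binary_accuracy"])

# Each prediction row holds 17 independent probabilities in [0, 1]
X = np.random.rand(8, 250).astype("float32")
probs = model.predict(X, verbose=0)
```

Unlike a softmax, these 17 probabilities do not sum to 1, so several classes can be "on" at once, which is exactly the semantics of 17 independent binary classifiers.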

Upvotes: 0
