Reputation: 25
What is the role of the Dense(128, activation="relu") layer, Dropout,
and softmax
in a neural network for the MNIST dataset? I need guidance on what each layer does in the code below.
import tensorflow as tf

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation='softmax')
])
Upvotes: 1
Views: 285
Reputation: 1550
The numbers 128 and 10 are the number of neurons in each layer of your network. tf.keras.layers.Dense() creates a fully connected layer: every neuron computes a weighted sum of all its inputs plus a bias, then applies the activation function.
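As a rough sketch of what Dense(128, activation='relu') computes (plain NumPy for illustration; the random W and b here just stand in for weights the network would actually learn):

import numpy as np

x = np.random.rand(784)             # one flattened 28x28 image
W = np.random.rand(784, 128)        # weight matrix (learned during training)
b = np.zeros(128)                   # bias vector (learned during training)
hidden = np.maximum(0, x @ W + b)   # weighted sum + bias, then ReLU
print(hidden.shape)                 # (128,) -- one value per neuron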
relu and softmax are activation functions; they introduce non-linearity into the output of a neuron. ReLU outputs max(0, x), while softmax converts the 10 raw output scores into a probability distribution over the digit classes 0-9.
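A minimal sketch of the two activations themselves (plain NumPy, not the Keras implementations):

import numpy as np

def relu(x):
    # keeps positive values, zeroes out negatives
    return np.maximum(0, x)

def softmax(x):
    # exponentiate and normalize so the outputs sum to 1
    e = np.exp(x - x.max())  # subtract the max for numerical stability
    return e / e.sum()

scores = np.array([2.0, -1.0, 0.5])
print(relu(scores))     # [2.  0.  0.5]
print(softmax(scores))  # probabilities summing to 1.0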
The Dropout layer provides regularization and helps prevent your neural network from over-fitting. Put simply, during training dropout randomly deactivates a fraction of the neurons (20% here, from Dropout(0.2)) on each update, so the network cannot become dependent on specific co-adapted features. At inference time dropout is disabled.
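You can see the dropout behaviour directly, and the sketch below also shows one common way to compile and train this model on MNIST (assumes TensorFlow 2.x; the Adam optimizer and 5 epochs are just reasonable defaults, not the only choice):

import tensorflow as tf

# Dropout only acts during training: ~20% of entries are zeroed,
# and the survivors are scaled by 1/0.8 to keep the expected sum constant.
drop = tf.keras.layers.Dropout(0.2)
x = tf.ones((1, 10))
print(drop(x, training=True))   # some zeros, the rest scaled to 1.25
print(drop(x, training=False))  # unchanged: dropout is a no-op at inference

# Training the model from the question end-to-end:
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation='softmax')
])

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',  # integer labels 0-9
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)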
Upvotes: 4