Hamza Farooq

Reputation: 25

Understanding the role of layers and activation functions in Keras neural network

What is the role of `Dense(128, activation="relu")`, `Dropout`, and `softmax` in this neural network for the MNIST dataset? I need proper guidance on each layer of this whole code:

model = tf.keras.models.Sequential([
      tf.keras.layers.Flatten(input_shape=(28, 28)),
      tf.keras.layers.Dense(128, activation='relu'),
      tf.keras.layers.Dropout(0.2),
      tf.keras.layers.Dense(10, activation='softmax')
    ])

Upvotes: 1

Views: 285

Answers (1)

Strange

Reputation: 1550

The numbers 128 and 10 are the numbers of neurons in each layer of your network. `tf.keras.layers.Dense()` is used to create these fully-connected layers.
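Not part of the original answer, but a quick bit of bookkeeping shows how those neuron counts determine the shapes and parameter counts of the model in the question (assuming the standard 28x28 grayscale MNIST input):

```python
# Flatten turns each 28x28 image into a flat vector of 784 values.
flat = 28 * 28

# Dense(128): every one of the 784 inputs connects to each of the
# 128 neurons (weights), plus one bias per neuron.
hidden_params = flat * 128 + 128

# Dense(10): 128 inputs to each of the 10 output neurons, plus biases.
output_params = 128 * 10 + 10

print(flat, hidden_params, output_params)  # 784 100480 1290
```

These are the same numbers `model.summary()` would report for that architecture.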

relu and softmax are activation functions; they are used to provide non-linearity to the output of a neuron.
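As a minimal sketch (not from the original answer), both functions are simple enough to write out in numpy. ReLU zeroes out negative values; softmax turns the 10 raw output scores into a probability distribution over the digit classes:

```python
import numpy as np

def relu(x):
    # Keep positive values, zero out negatives --
    # this is what introduces the non-linearity.
    return np.maximum(0, x)

def softmax(x):
    # Exponentiate and normalize so the outputs are positive
    # and sum to 1 (a probability distribution over classes).
    # Subtracting the max is a standard numerical-stability trick.
    e = np.exp(x - np.max(x))
    return e / e.sum()

print(relu(np.array([-2.0, 0.0, 3.0])))   # [0. 0. 3.]
probs = softmax(np.array([1.0, 2.0, 3.0]))
print(probs.sum())                        # 1.0
```

This is why softmax sits on the final `Dense(10)` layer: its output can be read directly as "the probability this image is digit 0, 1, ..., 9".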

The purpose of activation functions is well described here:

https://ai.stackexchange.com/questions/5493/what-is-the-purpose-of-an-activation-function-in-neural-networks.

The Dropout layer provides regularization to the network and thereby helps prevent your neural network from over-fitting. Put simply, dropout randomly de-activates some neurons during training, so that the interdependency between specific features is removed.

See this: https://medium.com/@amarbudhiraja/https-medium-com-amarbudhiraja-learning-less-to-learn-better-dropout-in-deep-machine-learning-74334da4bfc5

Upvotes: 4
