LifeLongLearner

Reputation: 43

Is there any difference between ReLU as an activation function or as a layer?

Is there any difference between specifying ReLU as an activation function or adding it as a separate layer? For example:

Conv2D(filters=8, kernel_size=(3, 3), activation='relu', padding='SAME', name='conv_2')

or

Conv2D(filters=8, kernel_size=(3, 3), padding='SAME', name='conv_2'),
ReLU()

Upvotes: 3

Views: 990

Answers (1)

sgd

Reputation: 136

There is no practical difference, except that with the latter you can set parameters on the ReLU() layer*. In the first case, it uses the default parameters.

*https://www.tensorflow.org/api_docs/python/tf/keras/layers/ReLU
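As a minimal sketch of the point (assuming TensorFlow 2.x and a hypothetical 32×32 RGB input), the two forms below are identical with default parameters, while the layer form can be configured, e.g. with a cap and a leaky negative slope:

import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, Input, ReLU

# Inline activation: plain ReLU with its default behavior.
inline = Sequential([
    Input(shape=(32, 32, 3)),  # hypothetical input shape for illustration
    Conv2D(filters=8, kernel_size=(3, 3), activation='relu',
           padding='SAME', name='conv_2'),
])

# Separate layer: same math by default, but configurable,
# e.g. a ReLU6-style cap plus a small negative slope.
configurable = Sequential([
    Input(shape=(32, 32, 3)),
    Conv2D(filters=8, kernel_size=(3, 3), padding='SAME', name='conv_2'),
    ReLU(max_value=6.0, negative_slope=0.1),
])

A ReLU() layer left at its defaults (max_value=None, negative_slope=0.0, threshold=0.0) computes exactly the same function as activation='relu'.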

Upvotes: 2
