L.Yz

Reputation: 13

How to initialize the alpha parameter of tf.keras.layers.PReLU

I've read some papers about the activation function PReLU, where the parameter alpha is initialized to the value 0.25. How do I initialize alpha in tf.keras.layers.PReLU?

Upvotes: 1

Views: 1520

Answers (1)

Gokul Thiagarajan

Reputation: 830

You can refer to the official documentation for all parameters:

https://www.tensorflow.org/api_docs/python/tf/keras/layers/PReLU

tf.keras.layers.PReLU(
    alpha_initializer='zeros', alpha_regularizer=None, alpha_constraint=None,
    shared_axes=None, **kwargs
)

To initialize alpha to 0.25, use this:

tf.keras.layers.PReLU(alpha_initializer=tf.initializers.constant(0.25))
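As a quick sanity check, here is a minimal sketch showing that the layer's alpha weights start at 0.25 and that negative inputs are scaled by that value:

```python
import numpy as np
import tensorflow as tf

# PReLU with alpha initialized to 0.25 (the value suggested in the PReLU paper)
prelu = tf.keras.layers.PReLU(
    alpha_initializer=tf.keras.initializers.Constant(0.25)
)

x = tf.constant([[-2.0, -1.0, 0.0, 1.0]])
y = prelu(x)  # negative inputs are multiplied by alpha; non-negative pass through

print(y.numpy())           # [[-0.5  -0.25  0.    1.  ]]
print(prelu.alpha.numpy())  # every alpha weight is 0.25
```

`tf.keras.initializers.Constant(0.25)` and `tf.initializers.constant(0.25)` are equivalent here; note that alpha remains trainable, so 0.25 is only its starting value.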

Upvotes: 1
