Amin

Reputation: 187

How can I use "leaky_relu" as an activation in Tensorflow "tf.layers.dense"?

Using TensorFlow 1.5, I am trying to add a leaky_relu activation to the output of a dense layer while being able to change the alpha of leaky_relu (check here). I know I can do it as follows:

output = tf.layers.dense(input, n_units)
output = tf.nn.leaky_relu(output, alpha=0.01)

I was wondering if there is a way to write this in one line as we can do for relu:

output = tf.layers.dense(input, n_units, activation=tf.nn.relu)

I tried the following but I get an error:

output = tf.layers.dense(input, n_units, activation=tf.nn.leaky_relu(alpha=0.01))
TypeError: leaky_relu() missing 1 required positional argument: 'features'

Is there a way to do this?

Upvotes: 13

Views: 34084

Answers (5)

Adi Shumely

Reputation: 397

This works for me:

from tensorflow.keras.layers import LeakyReLU
 
output = tf.layers.dense(input, n_units, activation=LeakyReLU(alpha=0.01))

Upvotes: 2

Katsuya

Reputation: 151

At least as of TensorFlow version 2.3.0.dev20200515, a LeakyReLU activation with an arbitrary alpha parameter can be used as the activation parameter of a Dense layer:

output = tf.keras.layers.Dense(n_units, activation=tf.keras.layers.LeakyReLU(alpha=0.01))(x)

LeakyReLU activation works as:

f(x) = x if x > 0, else alpha * x

[graph of LeakyReLU: the identity for x > 0, a line with slope alpha for x < 0]

More information: Wikipedia - Rectifier (neural networks)
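
For reference, a minimal sketch that checks this formula numerically with tf.nn.leaky_relu (assuming TensorFlow 2.x eager execution; the sample values are arbitrary):

import tensorflow as tf

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])
# leaky_relu returns x where x > 0 and alpha * x elsewhere
print(tf.nn.leaky_relu(x, alpha=0.01).numpy())  # [-0.02 -0.01  0.  1.  2.]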

Upvotes: 14

Philip Egger

Reputation: 326

I wanted to do something similar in TensorFlow 2.0, and I used lambda notation, as in

output = tf.layers.dense(input, n_units, activation=lambda x: tf.nn.leaky_relu(x, alpha=0.01))

Could be a good way to fit it all in one line.

Upvotes: 2

domochevski

Reputation: 553

If you're really adamant about a one-liner for this, you could use the partial() function from the functools module, as follows:

import tensorflow as tf
from functools import partial

output = tf.layers.dense(input, n_units, activation=partial(tf.nn.leaky_relu, alpha=0.01))

It should be noted that partial() does not work for all operations and you might have to try your luck with partialmethod() from the same module.
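
For illustration, a minimal sketch of what partial produces here: it binds alpha up front and leaves the tensor argument free, which is exactly the one-argument callable the activation parameter expects (assuming TensorFlow 2.x eager execution):

import tensorflow as tf
from functools import partial

# bind alpha=0.01, leaving the tensor argument free
leaky = partial(tf.nn.leaky_relu, alpha=0.01)
print(leaky(tf.constant([-1.0, 1.0])).numpy())  # [-0.01  1.]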

Hope this helps you in your endeavour.

Upvotes: 16

Jonas Adler

Reputation: 10759

You are trying to do partial evaluation, and the easiest way to do this is to define a new function and use it:

def my_leaky_relu(x):
    # leaky_relu with alpha fixed at 0.01; takes a single tensor argument,
    # matching the signature expected by the activation parameter
    return tf.nn.leaky_relu(x, alpha=0.01)

and then you can run

output = tf.layers.dense(input, n_units, activation=my_leaky_relu)

Upvotes: 5
