David Crook

Reputation: 2730

TensorFlow Layers API Linear Activation Function

This question is similar to How to use a linear activation function in TensorFlow?, but it is not the same.

On the final dense layer I want to output 28 nodes with a linear activation, not a sigmoid. I am using the new layers API as shown here: https://www.tensorflow.org/tutorials/layers

However, my final layer stack looks like this:

import tensorflow as tf
from tensorflow.contrib import learn

# pool3 and mode are defined earlier in the model function
flat = tf.reshape(pool3, [-1, 128 * 128 * 128])  # width (after poolings) * height (after poolings) * filters
dense1 = tf.layers.dense(inputs=flat, units=4096, activation=tf.nn.relu)
dense2 = tf.layers.dense(inputs=dense1, units=4096, activation=tf.nn.relu)
dropout = tf.layers.dropout(
            inputs=dense2, rate=0.4, training=mode == learn.ModeKeys.TRAIN)
output = tf.layers.dense(inputs=dropout, units=28)

How does one ensure that the output of the 28 nodes is in fact linear? In CNTK, you specify the activation function as None (see: cntk linear activation function in layers?).

Pointers are greatly appreciated. Thanks!

Upvotes: 3

Views: 3990

Answers (1)

interjay

Reputation: 110069

The documentation of dense says this about the activation parameter:

activation: Activation function (callable). Set it to None to maintain a linear activation.

None is the default value, so not specifying the activation sets it to linear.
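As a minimal sketch using the question's own output layer: since None is the default, both of the lines below produce the same linear (identity) output.

output = tf.layers.dense(inputs=dropout, units=28)                   # activation defaults to None
output = tf.layers.dense(inputs=dropout, units=28, activation=None)  # explicit, same linear result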

Upvotes: 8
