Paw in Data

Reputation: 1604

How to clip a layer's output in an MLP with `tf.keras.activations.relu()`?

According to the documentation, `tf.keras.activations.relu(x, alpha=0.0, max_value=None, threshold=0)` seems to clip `x` to the range [threshold, max_value], but `x` must be specified. How can I use it to clip the output of a layer in a neural network? Or is there a more convenient way to do so?
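(For what it's worth, calling the activation directly on a plain tensor does behave as described; a quick check with made-up values:

import tensorflow as tf

# Entries below the default threshold (0) go to 0; entries above max_value are capped.
x = tf.constant([-2.0, 1.0, 3.0, 7.0])
print(tf.keras.activations.relu(x, max_value=5.0))  # [0. 1. 3. 5.]

The question is how to get this behavior for a layer inside a model.)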

Suppose I want to output the linear combination of all elements of a 10-by-10 2D-array only when the result is between 0 and 5.

import tensorflow as tf
from tensorflow import keras

model = keras.models.Sequential()
model.add(keras.layers.Flatten(input_shape=[10, 10]))
model.add(keras.layers.Dense(1, activation='relu'))    # output layer

Upvotes: 1

Views: 1096

Answers (1)

Miss Girl

Reputation: 11

TensorFlow's `tf.clip_by_value` (see the TensorFlow API docs) does exactly this. For anyone who needs it in the future:

tf.clip_by_value(
    t, clip_value_min, clip_value_max, name=None
)
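To apply it to a layer's output in the question's model, one option is a `Lambda` layer after a linear `Dense` layer; a minimal sketch, assuming the [0, 5] bounds from the question:

import tensorflow as tf
from tensorflow import keras

model = keras.models.Sequential()
model.add(keras.layers.Flatten(input_shape=[10, 10]))
model.add(keras.layers.Dense(1))  # linear combination, no activation
# Clip the output element-wise to [0, 5]
model.add(keras.layers.Lambda(lambda t: tf.clip_by_value(t, 0.0, 5.0)))

For the special case of clipping to [0, max], `keras.layers.ReLU(max_value=5.0)` after the linear `Dense` layer achieves the same thing, as does passing `activation=lambda x: keras.activations.relu(x, max_value=5.0)` to the `Dense` layer itself.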

Upvotes: 0
