Reputation: 3331
I see this "tf.nn.relu" documented here: https://www.tensorflow.org/api_docs/python/tf/nn/relu
But then I also see usage of tf.contrib.layers.relu on this page in "model_fn": https://www.tensorflow.org/extend/estimators
It seems the latter isn't documented in an API-reference fashion like the first one; it's only shown in use.
Why is this? Are the docs out of date? Why have two - is one old and no longer supported/going to be removed?
Upvotes: 3
Views: 3583
Reputation: 113
During the 2017 TensorFlow Dev Summit, the tf.contrib section of the TensorFlow project was described as a testing ground for higher-level functions. These functions are there for the community to use and test, but there is no guarantee that the interface won't change when a function is moved into core TensorFlow. Between TensorFlow versions r0.12 and r1.0, many of the tf.contrib.layers functions were moved to tf.layers (which did not exist before r1.0). In short, documentation of tf.contrib will never be as good as that of tf.core.
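To make that move concrete, here is a minimal sketch of the same layer written both ways under TensorFlow r1.x (the placeholder shape and layer size are made up for illustration):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 4])

# Pre-graduation style: the contrib "testing ground" version.
h_contrib = tf.contrib.layers.fully_connected(
    x, num_outputs=10, activation_fn=tf.nn.relu)

# r1.0+ style: the counterpart that moved into tf.layers.
# Note the interface change: num_outputs/activation_fn became units/activation.
h_core = tf.layers.dense(x, units=10, activation=tf.nn.relu)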
Upvotes: 2
Reputation: 78546
They are not the same thing.
The latter is not an activation function but a fully_connected layer that has its activation function preset to nn.relu:
relu = functools.partial(fully_connected, activation_fn=nn.relu)
# `relu` here is what's exported as tf.contrib.layers.relu;
# `nn.relu` is the plain tf.nn.relu activation function.
If you read the docs for contrib.layers, you'll find:

Aliases for fully_connected which set a default activation function are available: relu, relu6 and linear.
In summary, tf.contrib.layers.relu is an alias for a fully_connected layer with relu activation, while tf.nn.relu is the REctified Linear Unit activation function itself.
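To see the difference side by side, here is a minimal TF1-era sketch (the input shape and num_outputs are made up; this assumes the alias behaves exactly like the partial shown above):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 4])

# tf.nn.relu: just the element-wise activation, no weights or bias.
activated = tf.nn.relu(x)

# tf.contrib.layers.relu: a fully_connected layer whose
# activation_fn is preset to tf.nn.relu...
hidden_a = tf.contrib.layers.relu(x, num_outputs=10)

# ...which should build the same kind of graph as spelling it out:
hidden_b = tf.contrib.layers.fully_connected(
    x, num_outputs=10, activation_fn=tf.nn.relu)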
Upvotes: 5