Yurui Ming

Reputation: 21

What's the purpose of tf.nn.dropout?

I notice there are two APIs in TensorFlow concerned with dropout: one is tf.nn.dropout, the other is tf.layers.dropout. I just wonder what the purpose of tf.nn.dropout is. According to https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf, there should be a parameter to distinguish between the training and testing stages. I see that tf.layers.dropout provides the proper behavior, so why does tf.nn.dropout exist as well? Does anyone have any idea? Thanks.

Upvotes: 0

Views: 368

Answers (1)

Mehul Jain

Reputation: 491

tf.layers.dropout uses the tf.nn.dropout function internally.

tf.layers.dropout might be useful if you just want a higher-level abstraction and do not want to control every facet of the dropout yourself; tf.nn.dropout is the lower-level op for when you do.

Look at the API docs:

1) https://www.tensorflow.org/api_docs/python/tf/layers/dropout

2) https://www.tensorflow.org/api_docs/python/tf/nn/dropout

tf.layers.dropout is a wrapper around tf.nn.dropout, and there is a slight difference in their arguments: tf.layers.dropout takes the dropout rate (the fraction of inputs to drop), while tf.nn.dropout takes keep_prob (the probability of keeping each input). A direct relation holds between them: keep_prob = 1 - rate.
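To make the keep_prob semantics concrete, here is a minimal NumPy sketch of what tf.nn.dropout-style ("inverted") dropout does; the function name nn_dropout is hypothetical, and this is an illustration of the semantics, not TensorFlow's actual implementation:

```python
import numpy as np

def nn_dropout(x, keep_prob, rng=None):
    # Sketch of tf.nn.dropout semantics: keep each element with
    # probability keep_prob and scale the kept elements by 1/keep_prob,
    # so the expected value of each element is unchanged.
    if rng is None:
        rng = np.random.default_rng(0)
    mask = rng.random(x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)

x = np.ones((4, 4))
y = nn_dropout(x, keep_prob=0.8)
# Kept elements are scaled to 1/0.8 = 1.25; dropped elements are 0.
```

With the layers-style API you would instead pass rate=0.2 to get the same keep_prob of 0.8.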

Also, there's an extra argument, training, in tf.layers.dropout, which controls whether to return the output in training mode (apply dropout) or in inference mode (return the input untouched).
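The training switch can be sketched in a few lines of NumPy; layers_dropout below is a hypothetical stand-in for the tf.layers.dropout behavior, not TensorFlow's real code:

```python
import numpy as np

def layers_dropout(x, rate=0.5, training=False, rng=None):
    # Sketch of the layers-style API: `rate` is the fraction of units
    # to drop (so keep_prob = 1 - rate), and `training` selects
    # between applying dropout and passing the input through.
    if not training:
        return x  # inference mode: input returned untouched
    if rng is None:
        rng = np.random.default_rng(0)
    keep_prob = 1.0 - rate
    mask = rng.random(x.shape) < keep_prob
    return np.where(mask, x / keep_prob, 0.0)

x = np.ones((3, 3))
inference_out = layers_dropout(x, rate=0.5, training=False)
training_out = layers_dropout(x, rate=0.5, training=True)
```

In inference mode the output is identical to the input; in training mode each element is either zeroed or scaled up by 1/keep_prob.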

Upvotes: 1

Related Questions