Make42

Reputation: 13118

Neural networks: Combining ReLUs and Batch normalization?

In https://datascience.stackexchange.com/questions/14352/how-are-deep-learning-nns-different-now-2016-from-the-ones-i-studied-just-4-ye I was told that one should use Batch normalization:

It's been known for a while that NNs train best on data that is normalized --- i.e., it has zero mean and unit variance.

I was also told one should use ReLU neurons - especially if the inputs are images. Images usually have pixel values between 0 and 1 or between 0 and 255. A minimal numpy sketch of that kind of input standardization is shown below.
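
The batch shape and values here are made up purely for illustration:

```python
import numpy as np

# Hypothetical batch of grayscale images with pixel values in [0, 255].
images = np.random.randint(0, 256, size=(64, 28, 28)).astype(np.float32)

# Standardize to zero mean and unit variance, as the quoted advice suggests.
normalized = (images - images.mean()) / images.std()

print(normalized.mean(), normalized.std())  # approximately 0.0 and 1.0
```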

Question: Is it wise to combine ReLUs with Batch normalization?

I would imagine that if I apply Batch normalization first, I might lose information (perhaps half of my information) once it passes through the ReLUs.
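
To make the worry concrete, here is a small numpy sketch with synthetic activations: batch-normalized values are roughly zero-centered, so a plain ReLU would zero out about half of them.

```python
import numpy as np

# Synthetic pre-activations with arbitrary mean and scale.
x = np.random.randn(10_000) * 3.0 + 5.0

# Batch normalization (without the learnable scale/shift) centers them at zero...
x_bn = (x - x.mean()) / x.std()

# ...so a plain ReLU zeroes out roughly half of the values.
relu_out = np.maximum(x_bn, 0.0)
print(f"fraction zeroed by ReLU: {(relu_out == 0).mean():.2f}")  # ~0.50
```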

Upvotes: 0

Views: 93

Answers (1)

Dr. Snoopy

Reputation: 56407

There is no problem with combining Batch Normalization and ReLUs; it is done very often. For example, the original paper on Residual Networks does this and obtains very good results on ImageNet classification.
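
As an illustrative sketch of the usual ordering (Keras here; the input shape and layer sizes are arbitrary), a residual-style block applies Conv -> BatchNorm -> ReLU:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Conv -> BatchNorm -> ReLU, the ordering used in the original ResNet paper.
inputs = tf.keras.Input(shape=(32, 32, 3))
x = layers.Conv2D(64, 3, padding="same", use_bias=False)(inputs)  # bias is redundant before BN
x = layers.BatchNormalization()(x)
x = layers.Activation("relu")(x)
model = tf.keras.Model(inputs, x)
model.summary()
```

Note also that Batch Normalization has learnable scale and shift parameters (gamma and beta), so the network can move activations away from zero rather than having half of them clipped; this is part of why the combination does not lose information in practice.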

Upvotes: 2
