Pradeep Vairamani

Reputation: 4302

Information gain vs minimizing entropy

In what scenario is maximizing information gain not equivalent to minimizing entropy? More broadly, why do we need the concept of information gain at all? Is it not sufficient to work with entropy alone when deciding the next optimal attribute in a decision tree?

Upvotes: 2

Views: 2800

Answers (1)

user4061624


Maximizing information gain (also known as mutual information) is the same thing as minimizing the entropy that remains after the split. Information gain is defined as

IG(split) = H(parent) - sum_i (N_i / N) * H(child_i),

that is, the parent node's entropy minus the weighted average entropy of the child nodes. Since H(parent) is fixed for a given node, whichever attribute maximizes IG is exactly the attribute that minimizes the weighted child entropy, so the two criteria always select the same split. Information gain is just a convenient way of expressing how much entropy a split removes.
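Here is a minimal Python sketch (not from the original post; the split names, labels, and helper functions are invented for illustration) that computes both quantities for two candidate splits of the same node and shows that they rank the splits identically:

```python
# Entropy and information gain for candidate splits of one node (toy example).
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(parent_labels, child_groups):
    """Return (IG, weighted child entropy) for a split of parent_labels."""
    n = len(parent_labels)
    weighted_child_entropy = sum(
        (len(g) / n) * entropy(g) for g in child_groups
    )
    return entropy(parent_labels) - weighted_child_entropy, weighted_child_entropy

parent = ['yes'] * 5 + ['no'] * 5

# Two hypothetical splits of the same node; split B produces purer children.
split_a = [['yes', 'yes', 'no', 'no', 'no'], ['yes', 'yes', 'yes', 'no', 'no']]
split_b = [['yes', 'yes', 'yes', 'yes', 'no'], ['yes', 'no', 'no', 'no', 'no']]

for name, split in [('A', split_a), ('B', split_b)]:
    ig, child_h = information_gain(parent, split)
    print(f"split {name}: weighted child entropy = {child_h:.3f}, IG = {ig:.3f}")

# H(parent) is identical for both splits, so the split with the lower weighted
# child entropy (B) is necessarily the one with the higher information gain.
```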

Upvotes: 0
