Reputation: 4302
In what scenario is maximizing of information gain not equivalent to minimizing of entropy? The broader question is why do we need the concept of information gain? Is it not sufficient to work only with entropy to decide the next optimal attribute of a decision tree?
Upvotes: 2
Views: 2800
Reputation:
Maximizing the information gain (also known as mutual information) is equivalent to minimizing entropy, as long as you are precise about *which* entropy. When splitting a node S on an attribute A, the gain is IG(S, A) = H(S) − H(S|A), where H(S|A) is the weighted average entropy of the children. The parent entropy H(S) is the same constant for every candidate attribute at that node, so the attribute that maximizes IG is exactly the one that minimizes the conditional entropy H(S|A). In that sense, yes, you could build the tree by minimizing child entropy alone. Information gain is still a useful concept because it measures the *reduction* relative to the parent, which makes splits comparable across nodes and gives a natural stopping criterion (stop when no split gains anything).
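A minimal sketch of this equivalence (toy data and attribute names are made up for illustration): both criteria pick the same attribute because they differ only by the constant H(S).

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(S) of a label sequence."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def conditional_entropy(values, labels):
    """Weighted child entropy H(S|A) after splitting on an attribute's values."""
    n = len(labels)
    groups = {}
    for v, y in zip(values, labels):
        groups.setdefault(v, []).append(y)
    return sum(len(g) / n * entropy(g) for g in groups.values())

def information_gain(values, labels):
    """IG(S, A) = H(S) - H(S|A)."""
    return entropy(labels) - conditional_entropy(values, labels)

# Toy example: one attribute separates the classes, the other does not.
labels = ['yes', 'yes', 'no', 'no']
attr_a = ['hot', 'hot', 'cold', 'cold']  # perfectly separates the labels
attr_b = ['x', 'y', 'x', 'y']            # uninformative split

# Max-IG and min-H(S|A) agree, since H(S) is the same constant for both:
print(information_gain(attr_a, labels))     # 1.0
print(information_gain(attr_b, labels))     # 0.0
print(conditional_entropy(attr_a, labels))  # 0.0
print(conditional_entropy(attr_b, labels))  # 1.0
```

The ranking by descending IG and the ranking by ascending conditional entropy are always identical at a given node; they can only differ if you compare splits across different nodes, where H(S) is no longer a shared constant.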
Upvotes: 0