Reputation: 1509
AdaBoost needs to update the weights of individual data points, but most machine learning algorithms do not take per-example weights into account. Is there a common way to implement example weights for algorithms like SVMs or neural networks?
Upvotes: 0
Views: 649
Reputation: 4101
There are two solutions: one that is specific to the classifier used, and a generic approach.
You are asking for the latter, but the former is imho always to be preferred if a specific method exists, because the generic approach below gives suboptimal results. If weighting is not directly supported by your classifier, you might find something using the search term "cost sensitive".
However, if a weight-sensitive training method is not available, you can resort to weighted sampling. The idea is to generate a derived training set by sampling with replacement, where the probability that a training example is drawn is proportional to its weight. E.g., if you have the weighted set
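As a sketch of the classifier-specific route: many libraries accept per-example weights directly at fit time. For instance, scikit-learn's `SVC.fit` takes a `sample_weight` argument (the data and weights below are made-up illustration values):

```python
# Sketch: pass AdaBoost-style example weights straight to a classifier
# that supports them. Assumes scikit-learn is installed.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
w = np.array([0.2, 0.4, 0.8, 0.8])  # per-example weights, e.g. from boosting

clf = SVC(kernel="linear")
# sample_weight scales each example's contribution to the loss
clf.fit(X, y, sample_weight=w)
print(clf.predict([[2.5]]))
```

This avoids the resampling approximation entirely, which is why the classifier-specific method is preferable when it exists.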
(e1,0.2),(e2,0.4),(e3,0.8)
then the probability that e3 is selected as the first example is 0.8/1.4 = 4/7.
The problem with this approach, however, is obvious: to represent these weights exactly, we would have to generate a training set with at least 7 elements (the weights are in the ratio 1:2:4). This increases runtime significantly. Nevertheless, for boosting, the exactness of the weighting during training is imho not that important.
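The resampling idea above can be sketched in a few lines with numpy, using the same three-example weighted set:

```python
# Sketch of weighted resampling: build a derived training set by sampling
# with replacement, with probability proportional to each example's weight.
import numpy as np

rng = np.random.default_rng(0)
examples = np.array(["e1", "e2", "e3"])
weights = np.array([0.2, 0.4, 0.8])
probs = weights / weights.sum()  # e3: 0.8/1.4 = 4/7

# Drawing 7 examples matches the smallest exact representation (ratio 1:2:4);
# any classifier without weight support can then be trained on `resampled`.
resampled = rng.choice(examples, size=7, replace=True, p=probs)
print(resampled)
```

In practice you would resample the row indices of your feature matrix rather than labels, and a larger sample size gives a closer approximation to the target weights.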
Upvotes: 1