TheM00s3

Reputation: 3711

Adjust weights for predicted classes in the XGBoost loss function

Is it possible to adjust the weighted error for a given target? What I'm trying to do is weight the loss more heavily for rarer classes when predicting multiple classes.

Upvotes: 0

Views: 3495

Answers (1)

epattaro

Reputation: 2428

If you are using the core data structure (DMatrix), you can set per-instance weights through the "set_weight" method:

set_weight(weight) Set weight of each instance.

Parameters: weight (array like) – Weight for each data point
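As a minimal sketch of how this could look for the multi-class case (the toy data, the choice of 3 classes, and the inverse-class-frequency weighting heuristic are my own assumptions for illustration, not from the question):

import numpy as np
import xgboost as xgb

# Toy data: X is a feature matrix, y holds integer class labels 0..2.
X = np.random.rand(100, 5)
y = np.random.randint(0, 3, size=100)

# Weight each instance inversely to its class frequency so that rarer
# classes contribute more to the loss (one common heuristic, not the only one).
class_counts = np.bincount(y)
weights = 1.0 / class_counts[y]

dtrain = xgb.DMatrix(X, label=y)
dtrain.set_weight(weights)  # per-instance weights applied during training

params = {"objective": "multi:softprob", "num_class": 3}
booster = xgb.train(params, dtrain, num_boost_round=10)

The same weights can also be passed directly via the weight argument of the DMatrix constructor instead of calling set_weight afterwards.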

While the documentation is quite lackluster on that topic, I have found a reasonable answer that might be useful in this previous question: How is the parameter "weight" (DMatrix) used in the gradient boosting procedure (xgboost)?

Quoting it:

Instance Weight File

XGBoost supports providing each instance a weight to differentiate the importance of instances. For example, we can provide an instance weight file for the "train.txt" file in the example as below:

train.txt.weight

1
0.5
0.5
1
0.5

This means that XGBoost will put more emphasis on the first and fourth instances (that is, the positive instances) while training. The configuration is similar to configuring the group information. If the instance file name is "xxx", XGBoost will check whether there is a file named "xxx.weight" in the same directory and, if there is, will use those weights while training the model.
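For completeness, here is a small sketch of the file-based variant described above (the file names and weight values mirror the quoted example; writing the .weight file from Python and the assumption that "train.txt" already exists in LIBSVM text format are mine):

import xgboost as xgb

# Write one weight per training instance next to the data file; per the quoted
# docs, XGBoost looks for "train.txt.weight" beside "train.txt" and uses it.
weights = [1, 0.5, 0.5, 1, 0.5]
with open("train.txt.weight", "w") as f:
    f.write("\n".join(str(w) for w in weights) + "\n")

# Loading the text file then picks up the sibling .weight file automatically.
# (Recent XGBoost versions may require an explicit format hint such as
# "train.txt?format=libsvm" when loading text files.)
dtrain = xgb.DMatrix("train.txt")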

hope it helps!

Upvotes: 1
