Peng He

Reputation: 2213

XGBoost: what does the parameter 'objective' set?

I want to solve a regression problem with XGBoost. I'm confused by the Learning Task parameter objective [default=reg:linear]. **It seems that 'objective' is used to set the loss function**, but I can't understand how 'reg:linear' influences the loss function. In the XGBoost logistic regression demo, objective = binary:logistic means the loss function is the logistic loss. So which loss function does 'objective=reg:linear' correspond to?

Upvotes: 5

Views: 9925

Answers (1)

T. Scharf

Reputation: 4834

So which loss function does 'objective=reg:linear' correspond to?

Squared error (in later XGBoost releases the same objective is named reg:squarederror).

You can take a look at the loss functions (defined via their gradient and Hessian) for both logistic regression and linear regression here:

https://github.com/dmlc/xgboost/blob/master/src/objective/regression_obj.cc

Note that the two loss functions are quite similar; the main difference is that the SecondOrderGradient (the Hessian) is a constant in square loss.

// common regressions
// linear regression
struct LinearSquareLoss {
  static float PredTransform(float x) { return x; }
  static bool CheckLabel(float x) { return true; }
  static float FirstOrderGradient(float predt, float label) { return predt - label; }
  static float SecondOrderGradient(float predt, float label) { return 1.0f; }
  static float ProbToMargin(float base_score) { return base_score; }
  static const char* LabelErrorMsg() { return ""; }
  static const char* DefaultEvalMetric() { return "rmse"; }
};
// logistic loss for probability regression task
struct LogisticRegression {
  static float PredTransform(float x) { return common::Sigmoid(x); }
  static bool CheckLabel(float x) { return x >= 0.0f && x <= 1.0f; }
  static float FirstOrderGradient(float predt, float label) { return predt - label; }
  static float SecondOrderGradient(float predt, float label) {
    const float eps = 1e-16f;
    return std::max(predt * (1.0f - predt), eps);
  }
  // ... (remaining members omitted)
};

The authors mention this in the regression demo: https://github.com/dmlc/xgboost/tree/master/demo/regression

Upvotes: 7
