AppleGate0

Reputation: 335

Tuning XGBoost parameters in R

I am trying to tune parameters using the caret package in R, but I get the error

Error in train.default(x = as.matrix(df_train %>% select(-c(Response,  : 
  The tuning parameter grid should have columns nrounds, lambda, alpha 

whenever I try to train the model, even though the columns nrounds, lambda, and alpha are present in my tuning grid.

library(caret)
library(xgboost)
library(readr)
library(dplyr)
library(tidyr)

xgb_grid_1 <- expand.grid(
  nrounds = 2400,
  eta = c(0.01, 0.001, 0.0001),
  lambda = 1,
  alpha = 0
)

xgb_trcontrol <- trainControl(
  method = "cv",
  number = 5,
  verboseIter = TRUE,
  returnData = FALSE,
  returnResamp = "all",
  allowParallel = TRUE
)

xgb_train_1 <- train(
  x = as.matrix(df_train %>% select(-c(Response, Id))),
  y = df_train$Response,
  trControl = xgb_trcontrol,
  tuneGrid = xgb_grid_1,
  method = "xgbLinear"
)

Upvotes: 8

Views: 16396

Answers (1)

phiver

Reputation: 23608

The problem lies in your xgb_grid_1. If you remove the eta line, it will work.
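
For example, the question's grid with the eta line removed:

xgb_grid_1 <- expand.grid(
  nrounds = 2400,
  lambda = 1,
  alpha = 0
)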

The standard tuning parameters for xgbLinear in caret are "nrounds", "lambda" and "alpha", not eta. Use the modelLookup function to see which tuning parameters are available for a given model. If you want to tune eta as well, you will have to create your own custom caret model that exposes this extra parameter; a sketch follows the modelLookup output below.

modelLookup("xgbLinear")
      model parameter                 label forReg forClass probModel
1 xgbLinear   nrounds # Boosting Iterations   TRUE     TRUE      TRUE
2 xgbLinear    lambda     L2 Regularization   TRUE     TRUE      TRUE
3 xgbLinear     alpha     L1 Regularization   TRUE     TRUE      TRUE
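
If you do want to tune eta, here is a minimal sketch of such a custom model, assuming you start from caret's built-in xgbLinear definition via getModelInfo. The names xgbLinear_eta and xgb_grid_eta are only illustrative, and the fit function is simplified for a regression response; note also that newer caret versions may already expose eta for xgbLinear, in which case this is unnecessary.

library(caret)
library(xgboost)

# start from the built-in model definition and add eta as a tunable parameter
xgbLinear_eta <- getModelInfo("xgbLinear", regex = FALSE)[[1]]
xgbLinear_eta$parameters <- rbind(
  xgbLinear_eta$parameters,
  data.frame(parameter = "eta", class = "numeric", label = "Learning Rate")
)

# simplified fit function (regression only) that passes eta through to xgb.train
xgbLinear_eta$fit <- function(x, y, wts, param, lev, last, classProbs, ...) {
  dtrain <- xgboost::xgb.DMatrix(as.matrix(x), label = y)
  xgboost::xgb.train(
    params = list(booster = "gblinear",
                  lambda = param$lambda,
                  alpha = param$alpha,
                  eta = param$eta,
                  objective = "reg:squarederror"),  # "reg:linear" on older xgboost
    data = dtrain,
    nrounds = param$nrounds
  )
}

# the tuning grid can now contain eta, and the custom list is passed to method =
xgb_grid_eta <- expand.grid(nrounds = 2400, lambda = 1, alpha = 0,
                            eta = c(0.01, 0.001, 0.0001))
# xgb_train_eta <- train(x = ..., y = ..., trControl = xgb_trcontrol,
#                        tuneGrid = xgb_grid_eta, method = xgbLinear_eta)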

Upvotes: 7
