Reputation: 11
I trained the following XGBoost classifier:

    import xgboost as xgb

    model = xgb.XGBClassifier(
        tree_method='hist',
        grow_policy='lossguide',
        gamma=1.0,
        max_depth=0,
        max_leaves=255,
        min_child_weight=100,
        n_estimators=500,
        n_jobs=-1,
        learning_rate=0.1,
        subsample=0.7,
        colsample_bytree=0.7,
    )
I plotted the learning curves (log loss vs. boosting round) for the training set and the validation set.
The two curves are virtually identical: log loss decreases at the beginning from 0.6 to 0.3 and then plateaus after about 100 rounds.
I believe this is a case of underfitting, given that both losses plateau at the same (high) value, which suggests high bias.
How can I increase the model's capacity in this case?
Any help is very much appreciated
Thank you
Upvotes: 1
Views: 2397