CuriousGeorge

Reputation: 43

Retrieve parameters from a loaded model in xgboost

I have made a classification model, which has been saved using

bst.save_model('final_model.model')

In another file I load the model and test it on my test data using:

bst = xgb.Booster()  # init model
bst.load_model('final_model.model')  # load model
ypred = bst.predict(dtest)  # make prediction

Since I use k-fold in my training process but need the whole test file for testing (so no k-fold splitting), I cannot verify that loading the model in a new file gives exactly the same results as it should. This made me curious whether there is a way to print the loaded model's hyperparameters. After a lot of googling I found a way to do this in R with xgb.parameters(bst), or maybe also xgb.attr(bst), but I have found no way to do this in Python. Since I do not use R I have not tested the lines above, but from the documentation they seem to do what I need: output the hyperparameters of a loaded model. So can this be done in Python with xgboost?

EDIT: I can see that if I instead write ypred = bst.predict(dtest, ntree_limit=bst.best_iteration) I get the error 'Booster' object has no attribute 'best_iteration'. So it seems that the loaded model does not remember all of my hyperparameters. If I write bst.attributes() I can get it to output the number of the best iteration and its eval score, but I don't see how to output the actual hyperparameters used.
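For reference, this is roughly how I imagine the stored attributes could be used to recover the best iteration at prediction time (untested sketch; the test-file path and the 'best_iteration' key are just what bst.attributes() shows on my machine, so treat them as assumptions):

import xgboost as xgb

bst = xgb.Booster()
bst.load_model('final_model.model')

# placeholder: however the test set is built in the testing script
dtest = xgb.DMatrix('test_data.libsvm')

# attributes() returns a dict of strings,
# e.g. {'best_iteration': '57', 'best_score': '0.4123'}
attrs = bst.attributes()
best_iteration = int(attrs['best_iteration'])

# ntree_limit counts trees, so use best_iteration + 1
# (assuming a single-output model without num_parallel_tree)
ypred = bst.predict(dtest, ntree_limit=best_iteration + 1)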

Upvotes: 3

Views: 8292

Answers (1)

Marco Visibelli

Reputation: 359

If you had used an xgboost.sklearn.XGBModel, you could use the function get_xgb_params(), but there is no equivalent in the base xgboost.Booster class. Remember that a Booster is the BASE model of xgboost, containing the low-level routines for training, prediction and evaluation. You can find more information here
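For example, something along these lines (a minimal sketch; the dataset is just a placeholder to make it self-contained, and it assumes you retrain with the sklearn wrapper rather than the low-level API):

import xgboost as xgb
from sklearn.datasets import load_breast_cancer

# placeholder data so the example runs end to end
X, y = load_breast_cancer(return_X_y=True)

clf = xgb.XGBClassifier(max_depth=4, n_estimators=100, learning_rate=0.1)
clf.fit(X, y)

# dict of the xgboost-level hyperparameters the wrapper passes down to the Booster
print(clf.get_xgb_params())

# the underlying low-level Booster is still reachable from the wrapper if needed
bst = clf.get_booster()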

Upvotes: 1
