NiMa

Reputation: 173

GridSearchCV and XGBClassifier with eval_metric = 'mlogloss'

Is it possible to use eval_metric = 'mlogloss' during the search for an XGBClassifier inside GridSearchCV? An example would be appreciated a lot.

Upvotes: 3

Views: 4632

Answers (1)

Anacleto

Reputation: 31

Yes, it's possible. You need to provide GridSearchCV with a score function that returns the log loss (negated, because the grid search selects the models with the higher score while we want the models with the lower loss) and that uses the model from the best iteration, as in:

from xgboost import XGBClassifier
from sklearn.model_selection import GridSearchCV  # sklearn.grid_search was removed in scikit-learn 0.20
from sklearn import metrics

tuned_parameters = {
    'learning_rate': [0.4, 0.5],
    'max_depth': [6, 7],
}

# fit-time parameters for XGBClassifier.fit (eval_metric and
# early_stopping_rounds are fit arguments in xgboost < 2.0; newer
# versions take them in the constructor, see the sketch further below)
fit_params = {
    "eval_set": [(X_test_tr_boost, y_test)],
    "eval_metric": "mlogloss",
    "early_stopping_rounds": 100,
    "verbose": True,
}

# With early stopping, XGBClassifier returns the model from the last iteration
# (not the best one). To give GridSearchCV the score of the best model, we use
# a custom score function that evaluates log_loss with the appropriate
# ntree_limit (instead of passing scoring='neg_log_loss' to GridSearchCV),
# so predictions come from the best iteration of the estimator.

def _score_func(estimator, X, y):
    # best_ntree_limit is set after early stopping (xgboost < 2.0;
    # newer versions expose best_iteration instead)
    proba = estimator.predict_proba(X, ntree_limit=estimator.best_ntree_limit)
    score = metrics.log_loss(y, proba, labels=[0, 1, 2, 3, 4, 5, 6, 7, 8])
    return -score  # negate: GridSearchCV picks the highest score

model = XGBClassifier(objective='multi:softprob', seed=0, n_estimators=1000)
gridsearch = GridSearchCV(model, tuned_parameters, verbose=999999,
                          scoring=_score_func)

# fit parameters go to fit() here; passing fit_params to the GridSearchCV
# constructor was removed in scikit-learn 0.21
gridsearch.fit(X_train_tr_boost, y_train, **fit_params)

print(gridsearch.best_params_)
print(gridsearch.best_score_)
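
For reference, newer versions of both libraries make the custom score function unnecessary. A minimal sketch of the same search, assuming xgboost >= 1.6 and scikit-learn >= 0.21 (verify against your installed versions): eval_metric and early_stopping_rounds become constructor arguments, predict_proba automatically uses the best iteration after early stopping, and fit parameters are forwarded through GridSearchCV.fit, so the built-in scoring='neg_log_loss' can be used directly.

from xgboost import XGBClassifier
from sklearn.model_selection import GridSearchCV

# eval_metric and early_stopping_rounds are constructor arguments
# in xgboost >= 1.6 (assumption: check your version's docs)
model = XGBClassifier(
    objective='multi:softprob',
    n_estimators=1000,
    random_state=0,
    eval_metric='mlogloss',
    early_stopping_rounds=100,
)

gridsearch = GridSearchCV(
    model,
    {'learning_rate': [0.4, 0.5], 'max_depth': [6, 7]},
    scoring='neg_log_loss',  # predict_proba already reflects the best iteration
    verbose=3,
)

# fit-time parameters are forwarded to XGBClassifier.fit by GridSearchCV
gridsearch.fit(X_train_tr_boost, y_train,
               eval_set=[(X_test_tr_boost, y_test)], verbose=False)

print(gridsearch.best_params_)
print(gridsearch.best_score_)

Note that, as in the original code, eval_set here is a single fixed hold-out reused across all cross-validation folds rather than a fold-aware validation set.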

Upvotes: 3
