Aditya Das

Reputation: 35

XGBoost Hyperparameter Tuning using Hyperopt

I am trying to tune my XGBClassifier model, but I am failing to do so. Please find the code below and help me clean it up and correct it.

    import csv
    import lightgbm as lgb
    import numpy as np
    from hyperopt import STATUS_OK
    from timeit import default_timer as timer

    MAX_EVALS = 200
    N_FOLDS = 10

    def objective(params, n_folds=N_FOLDS):
        """Objective function for Gradient Boosting Machine hyperparameter optimization."""
        # Keep track of evals (ITERATION, train_set and out_file are defined elsewhere)
        global ITERATION
        ITERATION += 1
        # Retrieve the subsample if present, otherwise set it to 1.0
        subsample = params['boosting_type'].get('subsample', 1.0)
        # Extract the boosting type
        params['boosting_type'] = params['boosting_type']['boosting_type']
        params['subsample'] = subsample
        # Make sure parameters that need to be integers are integers
        for parameter_name in ['num_leaves', 'subsample_for_bin',
                               'min_child_samples']:
            params[parameter_name] = int(params[parameter_name])
        start = timer()
        # Perform n_folds cross-validation
        cv_results = lgb.cv(params, train_set, num_boost_round=10000,
                            nfold=n_folds, early_stopping_rounds=100,
                            metrics='auc', seed=50)
        run_time = timer() - start
        # Extract the best score
        best_score = np.max(cv_results['auc-mean'])
        # Loss must be minimized
        loss = 1 - best_score
        # Boosting rounds that returned the highest cv score
        n_estimators = int(np.argmax(cv_results['auc-mean']) + 1)
        # Write to the csv file ('a' means append); the with-block closes the file
        with open(out_file, 'a') as of_connection:
            writer = csv.writer(of_connection)
            writer.writerow([loss, params, ITERATION, n_estimators, run_time])
        # Dictionary with information for evaluation
        return {'loss': loss, 'params': params, 'iteration': ITERATION,
                'estimators': n_estimators, 'train_time': run_time,
                'status': STATUS_OK}

I believe I am doing something wrong in the objective function: the code above is an objective function written for LightGBM, while I actually want to tune an XGBClassifier with Hyperopt. A rough sketch of what I am aiming for is included below.

Please help me.
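For context, this is a minimal sketch of the structure I am aiming for, using xgboost.cv instead of lgb.cv. The toy data, the dtrain DMatrix, and the search space are placeholders for illustration, not my real data or parameter ranges:

    import numpy as np
    import xgboost as xgb
    from hyperopt import STATUS_OK, Trials, fmin, hp, tpe
    from sklearn.datasets import make_classification

    # Toy binary-classification data standing in for the real training set
    X, y = make_classification(n_samples=500, n_features=10, random_state=42)
    dtrain = xgb.DMatrix(X, label=y)

    # Illustrative search space (not my real ranges)
    space = {
        'max_depth': hp.quniform('max_depth', 3, 10, 1),
        'learning_rate': hp.loguniform('learning_rate', np.log(0.01), np.log(0.3)),
        'subsample': hp.uniform('subsample', 0.5, 1.0),
        'colsample_bytree': hp.uniform('colsample_bytree', 0.5, 1.0),
    }

    def objective(params, n_folds=10):
        """Cross-validated AUC for one set of XGBoost hyperparameters."""
        # hp.quniform returns floats; xgboost expects an integer tree depth
        params['max_depth'] = int(params['max_depth'])
        params['objective'] = 'binary:logistic'
        cv_results = xgb.cv(params, dtrain, num_boost_round=200,
                            nfold=n_folds, early_stopping_rounds=100,
                            metrics='auc', seed=50)
        best_score = cv_results['test-auc-mean'].max()
        # Hyperopt minimizes, so turn AUC into a loss
        return {'loss': 1 - best_score, 'params': params, 'status': STATUS_OK}

    trials = Trials()
    best = fmin(fn=objective, space=space, algo=tpe.suggest,
                max_evals=10, trials=trials)
    print(best)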

Upvotes: 0

Views: 1432

Answers (1)

erdogant

Reputation: 1694

I created the hgboost library, which provides XGBoost hyperparameter tuning with Hyperopt:

    pip install hgboost

Examples can be found in the hgboost documentation; a minimal usage sketch is below.
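A minimal sketch of the intended usage, assuming the constructor arguments and the xgboost() method as described in the hgboost README (treat the exact names as an assumption and check the documentation; the toy data is only for illustration):

    import pandas as pd
    from sklearn.datasets import make_classification
    from hgboost import hgboost

    # Toy binary-classification data, only for illustration
    X, y = make_classification(n_samples=500, n_features=10, random_state=42)
    X = pd.DataFrame(X)  # the hgboost examples pass a DataFrame

    # Initialize a Hyperopt search with 5-fold CV (argument names assumed from the README)
    hgb = hgboost(max_eval=50, cv=5, random_state=42)

    # Tune an XGBoost classifier; pos_label marks the positive class
    results = hgb.xgboost(X, y, pos_label=1)

    # results is a dictionary describing the best model and parameters found
    print(results.keys())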

Upvotes: 0
