Fagui Curtain

Reputation: 1917

XGBoost on python : what is wrong with xgb.cv?

I'm trying to use xgboost in Python. Here is my code: xgb.train works, but I get an error with xgb.cv, even though I seem to be calling it correctly.

The following works for me:

###### XGBOOST ######

import datetime
startTime = datetime.datetime.now() 

import numpy as np
import xgboost as xgb
data_train   = np.array(traindata.drop('Category',axis=1))
labels_train = np.array(traindata['Category'].cat.codes)

data_valid   = np.array(validdata.drop('Category',axis=1))
labels_valid = np.array(validdata['Category'].astype('category').cat.codes)

weights_train = np.ones(len(labels_train))
weights_valid = np.ones(len(labels_valid))

dtrain = xgb.DMatrix(data_train, label=labels_train, weight=weights_train)
dvalid = xgb.DMatrix(data_valid, label=labels_valid, weight=weights_valid)

param = {'bst:max_depth':5, 'bst:eta':0.05, # eta [default=0.3]
         #'min_child_weight':1,'gamma':0,'subsample':1,'colsample_bytree':1,'scale_pos_weight':0, # default
         # max_delta_step:0 # default
         'min_child_weight':5,'scale_pos_weight':0, 'max_delta_step':2,
         'subsample':0.8,'colsample_bytree':0.8,
         'silent':1, 'objective':'multi:softprob' }


param['nthread'] = 4
param['eval_metric'] = 'mlogloss'
param['lambda'] = 2
param['num_class']=39

evallist  = [(dtrain,'train'),(dvalid,'eval')] # if there is a validation set
# evallist  = [(dtrain,'train')]                   # if there is no validation set

plst = param.items()                 # Python 2: items() returns a list of (key, value) pairs
plst += [('eval_metric', 'ams@0')]   # extra eval metrics are appended as (key, value) pairs

num_round = 100

bst = xgb.train( plst, dtrain, num_round, evallist,early_stopping_rounds=5 ) # early_stopping_rounds=10 # when there is a validation set

# bst.res=xgb.cv(plst,dtrain,num_round,nfold = 5,evallist,early_stopping_rounds=5)

bst.save_model('0001.model')

# dump model
bst.dump_model('dump.raw.txt')
# dump model with feature map
# bst.dump_model('dump.raw.txt','featmap.txt')

x = datetime.datetime.now() - startTime
print(x)

But if I change the line...

bst = xgb.train( plst, dtrain, num_round, evallist,early_stopping_rounds=5 ) 

...to this one...

bst.res = xgb.cv(plst,dtrain,num_round,nfold = 5,evallist,early_stopping_rounds=5)

...I get the following unexpected error:

File "", line 45 bst.res=xgb.cv(plst,dtrain,num_round,nfold = 5,evallist,early_stopping_rounds=5) SyntaxError: non-keyword arg after keyword arg

EDIT 1: I also tried changing the order of the arguments:

bst.res = xgb.cv(plst,dtrain,num_round,evallist,nfold = 5,early_stopping_rounds=5) 

...and I get the following error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-49-36177ef64bab> in <module>()
     43 # bst = xgb.train( plst, dtrain, num_round, evallist,early_stopping_rounds=5 ) # early_stopping_rounds=10 # when there is a validation set
     44
---> 45 bst.res=xgb.cv(plst,dtrain,num_round,evallist,nfold =5 ,early_stopping_rounds=5)
     46
     47 bst.save_model('0001.model')

TypeError: cv() got multiple values for keyword argument 'nfold'

EDIT 2: After all, CV doesn't need a separate validation set. There is no evals argument in the signature of xgb.cv (although xgb.train has one), so I removed it and changed the line to:

bst.res=xgb.cv(params=plst,dtrain=dtrain,num_boost_round=num_round,nfold = 5,early_stopping_rounds=5)

Then I get this error:

/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/xgboost/training.pyc
in cv(params, dtrain, num_boost_round, nfold, metrics, obj, feval,
maximize, early_stopping_rounds, fpreproc, as_pandas, show_progress,
show_stdv, seed)
    413     best_score_i = 0
    414     results = []
--> 415     cvfolds = mknfold(dtrain, nfold, params, seed, metrics, fpreproc)
    416     for i in range(num_boost_round):
    417         for fold in cvfolds:  
/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/xgboost/training.pyc
in mknfold(dall, nfold, param, seed, evals, fpreproc)
    280         else:
    281             tparam = param
--> 282         plst = list(tparam.items()) + [('eval_metric', itm) for itm in evals]
    283         ret.append(CVPack(dtrain, dtest, plst))
    284     return ret
AttributeError: 'list' object has no attribute 'items'

Upvotes: 2

Views: 5668

Answers (1)

Matthew Drury

Reputation: 1095

Here is the signature of xgboost.cv, copied from the documentation:

xgboost.cv(params, dtrain, num_boost_round=10, nfold=3, stratified=False,
    folds=None, metrics=(), obj=None, feval=None, maximize=False,
    early_stopping_rounds=None, fpreproc=None, as_pandas=True,
    verbose_eval=None, show_stdv=True, seed=0, callbacks=None)

Notice that only the first two parameters (params and dtrain) are required, and that the parameter in the fourth position is nfold.
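
If the docs you're reading don't match the version you have installed, you can also check the signature of your local copy directly. A quick sketch using only the standard library (inspect.getargspec is the Python 2.7 spelling; on Python 3 you'd use inspect.signature):

import inspect
import xgboost as xgb

# Print the parameter names and defaults of the xgb.cv you actually have installed.
print(inspect.getargspec(xgb.cv))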

Your call is:

xgb.cv(plst, dtrain, num_round, evallist, nfold=5, early_stopping_rounds=5) 

When Python parses a function call, it first matches all the arguments you passed positionally, in order. So in your case, Python matches like this:

Formal Parameter <-- What You Passed In
          params <-- plst
          dtrain <-- dtrain
 num_boost_round <-- num_round
           nfold <-- evallist

Then Python matches all the arguments you passed as keywords by name. So in your case, Python matches like this:

     Formal Parameter <-- What You Passed In
                nfold <-- 5
early_stopping_rounds <-- 5

So you can see that the formal parameter nfold gets assigned twice, which is what generates this error:

TypeError: cv() got multiple values for keyword argument 'nfold'
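
You can reproduce the same behaviour with a toy function (a sketch, nothing xgboost-specific), which makes the double assignment easy to see:

def cv(params, dtrain, num_boost_round=10, nfold=3):
    return nfold

# The fourth positional argument lands in the nfold slot,
# and nfold=5 then tries to fill the same slot a second time:
cv({}, 'dtrain', 100, 'evallist', nfold=5)
# TypeError: cv() got multiple values for keyword argument 'nfold'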

Probably the easiest and clearest fix is to pass all your arguments as keywords. It is generally good practice to keep positional arguments to a very small number; most programmers aim for about two at most.

"But I'm getting another error, and I can't figure it out, alas."

Looks like you're passing a list where a dictionary is expected. Using the docs again, the first argument:

params (dict) – Booster params.

Should be a dictionary.
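
Putting the two points together, the call would look something like this (a sketch reusing the names from your question; param is the dict you built before converting it to plst):

# Pass the parameters as a dict and everything else by keyword.
# This drops the extra ('eval_metric', 'ams@0') pair that only the list
# form could express; the cv signature above has a separate metrics
# argument for adding evaluation metrics.
res = xgb.cv(params=param, dtrain=dtrain, num_boost_round=num_round,
             nfold=5, early_stopping_rounds=5)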

Upvotes: 9
