Reputation: 501
I am testing hyperopt for parameter tuning of XGBoost. I am more or less replicating the code from here: https://www.kaggle.com/eikedehling/tune-and-compare-xgb-lightgbm-rf-with-hyperopt
I am using Python 3 and getting the following error for the code snippet given below. Any idea how to resolve this?
import xgboost as xgb
from hyperopt import fmin, hp, tpe
from sklearn.model_selection import StratifiedKFold, cross_val_score

def objective(params):
    params = {
        'max_depth': int(params['max_depth']),
        'gamma': "{:.3f}".format(params['gamma']),
        'colsample_bytree': '{:.3f}'.format(params['colsample_bytree']),
    }
    clf = xgb.XGBClassifier(
        n_estimators=50,
        learning_rate=0.1,
        n_jobs=4,
        **params
    )
    score = cross_val_score(clf, train_X, train_Y, scoring=gini_scorer,
                            cv=StratifiedKFold()).mean()
    print("Gini {:.3f} params {}".format(score, params))
    return score

space = {
    'max_depth': hp.quniform('max_depth', 2, 8, 1),
    'colsample_bytree': hp.uniform('colsample_bytree', 0.3, 1.0),
    'gamma': hp.uniform('gamma', 0.0, 0.5),
}

best = fmin(fn=objective,
            space=space,
            algo=tpe.suggest,
            max_evals=10)
Traceback (most recent call last):
  File "<stdin>", line 4, in <module>
  File "/anaconda3/envs/py27/lib/python2.7/site-packages/hyperopt/fmin.py", line 314, in fmin
    pass_expr_memo_ctrl=pass_expr_memo_ctrl)
  File "/anaconda3/envs/py27/lib/python2.7/site-packages/hyperopt/base.py", line 786, in __init__
    pyll.toposort(self.expr)
  File "/anaconda3/envs/py27/lib/python2.7/site-packages/hyperopt/pyll/base.py", line 715, in toposort
    assert order[-1] == expr
TypeError: 'generator' object has no attribute '__getitem__'
Upvotes: 1
Views: 760
Reputation: 501
Resolved! The issue is an incompatibility between hyperopt and networkx 2: in networkx 2, topological_sort returns a generator, which hyperopt's pyll.toposort then tries to index (the assert order[-1] == expr line in the traceback). One needs to downgrade to networkx 1.11.
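A minimal sketch of the downgrade, assuming pip manages the same environment your code runs in:

```shell
# Pin networkx to the last 1.x release, which hyperopt 0.1 was written against.
pip install "networkx==1.11"

# Confirm the downgrade took effect before re-running fmin.
python -c "import networkx; print(networkx.__version__)"
```

If you use conda, the equivalent would be `conda install networkx=1.11` in the same environment.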
Upvotes: 1