bazinga

Reputation: 2250

XGBoost (GPU) crashing while predicting

I am using the XGBoost GPU version in Python, and it crashes whenever I call .predict. It works for a smaller data set, but for my current problem it does not.

train_final.shape, test_final.shape
((631761, 174), (421175, 174))


params = {
          'objective': 'multi:softmax', 
          'eval_metric': 'mlogloss',
          'eta': 0.1,
          'max_depth': 6, 
          'nthread': 4,
          'alpha':0,
          'num_class': 5,
          'random_state': 42, 
          'tree_method': 'gpu_hist',
          'silent': True
             }

GPU stats: GTX 1070 (6 GB); system RAM: 32 GB

Could someone please help me understand why this is happening?

Upvotes: 2

Views: 1766

Answers (1)

BugKiller

Reputation: 1488

The crash is likely the GPU running out of memory at prediction time. Saving the model, deleting the booster to release the GPU memory it holds, then loading the model again should work around this.

import joblib
import xgboost as xgb

# training (params, dtrain, num_round as defined in the question)
bst = xgb.train(params, dtrain, num_round)

# save the model, then delete the booster to release its GPU memory
joblib.dump(bst, 'xgb_model.dat')
del bst

# load the saved model back and predict
bst = joblib.load('xgb_model.dat')
preds = bst.predict(dtest)

Upvotes: 4
