mineral

Reputation: 529

Warning occurring in XGBoost

I'm solving a multi-class classification problem using XGBoost.

However, warnings occur when fitting the XGBoost model.

My code is as follows (I'm using xgboost 1.4.0):

import time

import numpy as np
import xgboost
from sklearn.metrics import classification_report

start = time.time()

xgb_model = xgboost.XGBClassifier(tree_method='gpu_hist', eta=0.2, nrounds=1000,
                                  colsample_bytree=0.5,
                                  metric='multi:softmax')

hr_pred = xgb_model.fit(x_train, np.ravel(y_train, order='C')).predict(x_test)

print(classification_report(y_test, hr_pred))

print(time.time() - start)

The results come out fine, but these warnings pop up:

Parameters: { "metric", "nrounds" } might not be used.

  This may not be accurate due to some parameters are only used in language bindings but
  passed down to XGBoost core.  Or some parameters are not used but slip through this
  verification. Please open an issue if you find above cases.


UserWarning: Use subset (sliced data) of np.ndarray is not recommended because it will generate extra copies and increase memory consumption
  "because it will generate extra copies and increase " +
My questions:

1. Even if the check isn't accurate, I don't understand what it means for a parameter to be "passed down to XGBoost core".
2. Where did I use a subset of an ndarray?
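[Editor's note] The first warning usually means the sklearn wrapper did not recognize those keyword names and forwarded them to the core untouched. A minimal sketch of the likely renames, assuming (my assumption, based on the warning text) that `nrounds` was meant as `n_estimators` and that `multi:softmax` is really an objective rather than a metric:

```python
# "nrounds" and "metric" come from the R / native interfaces; the sklearn
# wrapper forwards unknown keyword arguments straight to the XGBoost core,
# which then reports that they might not be used.
native_style = {"nrounds": 1000, "metric": "multi:softmax"}

# Assumed sklearn-style equivalents (not from the original post):
sklearn_style = {
    "n_estimators": native_style["nrounds"],  # number of boosting rounds
    "objective": "multi:softmax",             # an objective, not a metric
}

print(sklearn_style)
```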

Upvotes: 9

Views: 7582

Answers (3)

Steve

Reputation: 1292

Insert this code snippet at the top of your script. The `message` argument is a regular expression that matches the unwanted warning message.

import warnings
warnings.filterwarnings(action="ignore", message=r'.*Use subset.*of np.ndarray is not recommended')

Upvotes: 0

Niels Uitterdijk

Reputation: 770

Personally, I find the solution of downgrading rather risky.

Instead, you can suppress this particular warning quite easily with warnings.filterwarnings(action='ignore', category=UserWarning).

Unfortunately, according to the devs, this is expected behaviour. https://github.com/dmlc/xgboost/issues/6908
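[Editor's note] A narrower variant of the same idea scopes the filter to just this message and just one block of code, instead of ignoring every UserWarning process-wide. A sketch, with the message text copied from the warning in the question and a `warnings.warn` call standing in for the `fit` call that triggers it:

```python
import warnings

# Suppress only the specific XGBoost message, and only inside the with-block.
with warnings.catch_warnings():
    warnings.filterwarnings(
        "ignore",
        message=r".*Use subset \(sliced data\) of np\.ndarray is not recommended.*",
        category=UserWarning,
    )
    # Stand-in for the xgb_model.fit(...) call that triggers the warning:
    warnings.warn(
        "Use subset (sliced data) of np.ndarray is not recommended "
        "because it will generate extra copies and increase memory consumption",
        UserWarning,
    )  # silently ignored

print("done")
```

Outside the `with` block, the default warning filters are restored, so unrelated UserWarnings still surface.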

Upvotes: 4

mwhee

Reputation: 682

I realize this is a temporary fix, but pip uninstall xgboost followed by pip install xgboost==1.3.3 worked for me.

Upvotes: 1
