ScalaBoy

Reputation: 3392

How to show original feature names in the feature importance plot?

I created an XGBoost model as follows:

import pandas as pd
from sklearn import preprocessing
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

y = XY.DELAY_MIN
X = standardized_df

train_X, test_X, train_y, test_y = train_test_split(X.as_matrix(), y.as_matrix(), test_size=0.25)

my_imputer = preprocessing.Imputer()
train_X = my_imputer.fit_transform(train_X)
test_X = my_imputer.transform(test_X)

xgb_model = XGBRegressor(n_estimators=1000, learning_rate=0.05)
xgb_model.fit(train_X, train_y, early_stopping_rounds=5,
              eval_set=[(test_X, test_y)], verbose=False)

When I create the feature importance plot, the features are shown as "f1", "f2", etc. How can I show the original feature names instead?

import xgboost as xgb
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(12, 18))
xgb.plot_importance(xgb_model, max_num_features=30, height=0.8, ax=ax)
plt.show()

Upvotes: 1

Views: 3417

Answers (1)

Mischa Lisovyi

Reputation: 3223

The issue is that the Imputer's transform() returns a NumPy array rather than a pd.DataFrame, so your column names are lost when you do

train_X = my_imputer.fit_transform(train_X)
test_X = my_imputer.transform(test_X)

A simple fix is to wrap the imputer output back into a DataFrame. Note that this requires train_X and test_X to still be DataFrames at that point, so pass X and y to train_test_split directly instead of calling .as_matrix() (which also strips the names). For example:

train_X = pd.DataFrame(my_imputer.fit_transform(train_X), columns=train_X.columns)
test_X  = pd.DataFrame(my_imputer.transform(test_X), columns=test_X.columns)
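A minimal, self-contained sketch of the idea, using SimpleImputer (the replacement for the deprecated Imputer in newer scikit-learn); the column names and values here are made up for illustration:

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer

# Toy frame with a missing value; hypothetical feature names.
df = pd.DataFrame({"DEP_HOUR": [1.0, np.nan, 3.0],
                   "DISTANCE": [100.0, 200.0, 300.0]})

imputer = SimpleImputer()  # default strategy: column mean
# transform() returns a bare ndarray -- the names are gone here...
arr = imputer.fit_transform(df)

# ...so wrap it back into a DataFrame with the original columns.
imputed = pd.DataFrame(arr, columns=df.columns)
print(list(imputed.columns))
```

A model fitted on `imputed` then sees the real column names, and plot_importance can display them instead of "f0", "f1", ….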

Upvotes: 3
