Reputation: 67
I am hyperparameter tuning a random forest and I would like to tune the parameter that controls the maximum number of features each tree considers at a split. According to sklearn's documentation it is:
The number of features to consider when looking for the best split:
If int, then consider max_features features at each split.
If float, then max_features is a percentage and int(max_features * n_features) features are considered at each split.
If “auto”, then max_features=sqrt(n_features).
If “sqrt”, then max_features=sqrt(n_features) (same as “auto”).
If “log2”, then max_features=log2(n_features).
If None, then max_features=n_features.
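For context, this is roughly how I am tuning that parameter in sklearn (a minimal sketch; the dataset and candidate values are just illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Toy data just to demonstrate the search
X, y = make_classification(n_samples=500, n_features=20, random_state=42)

# Candidate settings for max_features: an int, a float (fraction of features),
# and the string options listed in the documentation above
# ("auto" is omitted since it is equivalent to "sqrt").
param_grid = {"max_features": [5, 0.5, "sqrt", "log2", None]}

search = GridSearchCV(
    RandomForestClassifier(n_estimators=100, random_state=42),
    param_grid,
    cv=3,
)
search.fit(X, y)
print(search.best_params_)
```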
I tried looking through the h2o documentation, to no avail.
Does this parameter, or any of the different ways of specifying it (e.g. log2 of the number of features), exist in h2o?
Upvotes: 1
Views: 273