Reputation: 2561
I would like to keep the parameter alpha fixed at 1 and use random search for lambda. Is this possible?
library(caret)
X <- iris[, 1:4]
Y <- iris[, 5]
fit_glmnet <- train(X, Y, method = "glmnet", tuneLength = 2, trControl = trainControl(search = "random"))
Upvotes: 1
Views: 1051
Reputation: 19756
I do not think this can be specified directly in caret's train,
but here is how to emulate the desired behavior.
From this link
one can see that random search for lambda is achieved by:
lambda = 2^runif(len, min = -10, 3)
where len
is the tune length.
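As a quick sanity check (a minimal sketch, not part of the original answer), sampling with that formula always yields lambda values inside the interval (2^-10, 2^3):

```r
set.seed(42)                               # for reproducible sampling
len <- 5                                   # the tune length
lambda <- 2^runif(len, min = -10, max = 3) # same formula caret uses internally
range(lambda)                              # every value lies between 2^-10 and 2^3
stopifnot(all(lambda > 2^-10), all(lambda < 2^3))
```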
To emulate random search over one parameter:
len <- 2
fit_glmnet <- train(X, Y,
                    method = "glmnet",
                    tuneLength = len,
                    trControl = trainControl(search = "grid"),
                    tuneGrid = data.frame(alpha = 1, lambda = 2^runif(len, min = -10, 3)))
Upvotes: 4
Reputation: 50728
First off, I'm not sure you can use a random search and fix specific tuning parameters.
However, as an alternative you could use a grid search instead of a random search, and then fix tuning parameters via tuneGrid:
fit <- train(
  X,
  Y,
  method = "glmnet",
  tuneLength = 2,
  trControl = trainControl(search = "grid"),
  tuneGrid = data.frame(alpha = 1, lambda = 10^seq(-4, -1, by = 0.5)))
fit
#glmnet
#
#150 samples
# 4 predictor
# 3 classes: 'setosa', 'versicolor', 'virginica'
#
#No pre-processing
#Resampling: Bootstrapped (25 reps)
#Summary of sample sizes: 150, 150, 150, 150, 150, 150, ...
#Resampling results across tuning parameters:
#
# lambda Accuracy Kappa
# 0.0001000000 0.9398036 0.9093246
# 0.0003162278 0.9560817 0.9336278
# 0.0010000000 0.9581838 0.9368050
# 0.0031622777 0.9589165 0.9379580
# 0.0100000000 0.9528997 0.9288533
# 0.0316227766 0.9477923 0.9212374
# 0.1000000000 0.9141015 0.8709753
#
#Tuning parameter 'alpha' was held constant at a value of 1
#Accuracy was used to select the optimal model using the largest value.
#The final values used for the model were alpha = 1 and lambda = 0.003162278.
Upvotes: 2