Murtaza

Reputation: 11

Model learning times in R

I am quite new to R and machine learning in general and am trying to train a few different models on some data. The training set consists of 4,650 observations of 75 variables, including the target. With dummy variables, the total comes to around 130 predictors (I'm guessing, since I didn't do the preprocessing myself). The target is an ordered factor with 3 levels. With rattle, training takes 1-2 seconds for a tree, 45-55 seconds for a random forest, and around 30 seconds for an SVM. But using the nnet package through caret, I ran this code to fit a neural network model:

# 2 decay values x 4 sizes = 8 candidate models to tune over
nnet_grid <- expand.grid(.decay = c(.1, .01), .size = c(20, 30, 40, 50))
nnetfit <- caret::train(OUTPUT ~ ., data = hdtrain, method = "nnet",
                        maxit = 10000, tuneGrid = nnet_grid, MaxNWts = 10000)

It's now been more than 36 hours and this is still running. Is this sort of time expected? I'm running this on an i7-2720QM @ 2.2 GHz with 8 GB of memory.

Upvotes: 1

Views: 199

Answers (1)

Tchotchke

Reputation: 3121

You have both maxit and MaxNWts set quite high - is there a reason you chose those numbers? I'd suggest first trying the defaults of 100 and 1000, respectively, and potentially also shrinking your tuning grid. Then increase the iterations and the grid size once a run of that gets through.
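Here's a minimal sketch of that suggestion, assuming the same hdtrain data frame and OUTPUT column from your question; the grid values are just illustrative:

library(caret)

# Smaller grid: 2 decay values x 2 sizes = 4 candidate models
small_grid <- expand.grid(.decay = c(.1, .01), .size = c(3, 5))

# maxit and MaxNWts are left at nnet's defaults (100 and 1000)
nnetfit <- caret::train(OUTPUT ~ ., data = hdtrain, method = "nnet",
                        tuneGrid = small_grid)

Note that with ~130 predictors, hidden layers much larger than 5 units already exceed the default MaxNWts of 1000, so raise it only as far as the weight count actually requires.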

From the nnet documentation on MaxNWts:

The maximum allowable number of weights. There is no intrinsic limit in the code, but increasing MaxNWts will probably allow fits that are *very slow and time-consuming*. (emphasis mine)
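As a back-of-envelope check (taking ~130 predictors and 3 classes from the question as assumptions), a single-hidden-layer nnet with p inputs, h hidden units, and k outputs has (p + 1) * h + (h + 1) * k weights:

p <- 130; k <- 3
h <- c(20, 30, 40, 50)   # the .size values in the question's grid
setNames((p + 1) * h + (h + 1) * k, paste0("size=", h))
# size=20 size=30 size=40 size=50
#    2683    4023    5363    6703

Even size = 50 stays under the MaxNWts = 10000 ceiling, but caret's default bootstrap resampling refits each of the 8 grid combinations 25 times, and every fit can run up to maxit = 10000 iterations over thousands of weights - that alone is consistent with a multi-day runtime.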

Upvotes: 0
