Reputation: 313
I'm trying to generate predictions from a trained backpropagation neural network (the neuralnet package) on a new data set. I used the compute function, but I end up with essentially the same value for every observation. What did I do wrong?
library(neuralnet)
# the data
Var1 <- runif(50, 0, 100)
sqrt.data <- data.frame(Var1, Sqrt=sqrt(Var1))
# training the model
backnet = neuralnet(Sqrt~Var1, sqrt.data, hidden=2, err.fct="sse", linear.output=FALSE, algorithm="backprop", learningrate=0.01)
print(backnet)
Call: neuralnet(formula = Sqrt ~ Var1, data = sqrt.data, hidden = 2, learningrate = 0.01, algorithm = "backprop", err.fct = "sse", linear.output = FALSE)
1 repetition was calculated.
Error Reached Threshold Steps
1 883.0038185 0.009998448226 5001
valnet = compute(backnet, (1:10)^2)
summary(valnet$net.result)
V1
Min. :0.9998572
1st Qu.:0.9999620
Median :0.9999626
Mean :0.9999505
3rd Qu.:0.9999626
Max. :0.9999626
print(valnet$net.result)
[,1]
[1,] 0.9998572272
[2,] 0.9999477241
[3,] 0.9999617930
[4,] 0.9999625684
[5,] 0.9999625831
[6,] 0.9999625831
[7,] 0.9999625831
[8,] 0.9999625831
[9,] 0.9999625831
[10,] 0.9999625831
Upvotes: 2
Views: 6115
Reputation: 1045
I was able to get the following to work by dropping algorithm="backprop" and linear.output=FALSE (so the package defaults, rprop+ and a linear output neuron, are used) and widening the hidden layer:
library(neuralnet)
# the data
Var1 <- runif(50, 0, 100)
sqrt.data <- data.frame(Var1, Sqrt=sqrt(Var1))
# training the model
backnet = neuralnet(Sqrt~Var1, sqrt.data, hidden=10, learningrate=0.01)
print(backnet)
Var2 <- (1:10)^2
valnet = compute(backnet, Var2)
print(valnet$net.result)
Returns:
[,1]
[1,] 0.9341689395
[2,] 1.9992711472
[3,] 3.0012823496
[4,] 3.9968226732
[5,] 5.0038316976
[6,] 5.9992936957
[7,] 6.9991576925
[8,] 7.9996871591
[9,] 9.0000849977
[10,] 9.9891334545
According to the neuralnet reference manual, the package's default training algorithm is resilient backpropagation (rprop+), a variant of backpropagation:
neuralnet is used to train neural networks using backpropagation, resilient backpropagation (RPROP) with (Riedmiller, 1994) or without weight backtracking (Riedmiller and Braun, 1993) or the modified globally convergent version (GRPROP) by Anastasiadis et al. (2005). The function allows flexible settings through custom-choice of error and activation function. Furthermore the calculation of generalized weights (Intrator O. and Intrator N., 1993) is implemented.
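A likely reason the original code returned a constant value near 1: with linear.output=FALSE, neuralnet applies the activation function (logistic by default) to the output neuron as well, so predictions are confined to (0, 1). The targets here go up to sqrt(100) = 10, so the network simply saturates at its maximum. If you do want to keep algorithm="backprop" and a logistic output, you can rescale inputs and targets into [0, 1] first. A sketch of that approach (the scaling constants and stepmax value are my own choices, and plain backprop may still need a tuned learning rate to converge):

```r
library(neuralnet)

set.seed(1)
Var1 <- runif(50, 0, 100)
sqrt.data <- data.frame(Var1, Sqrt = sqrt(Var1))

# Rescale inputs and targets into [0, 1] so a logistic output neuron
# can actually reach the targets (sqrt(100) = 10 is far outside (0, 1)).
scaled.data <- data.frame(Var1 = sqrt.data$Var1 / 100,
                          Sqrt = sqrt.data$Sqrt / 10)

backnet <- neuralnet(Sqrt ~ Var1, scaled.data, hidden = 2,
                     err.fct = "sse", linear.output = FALSE,
                     algorithm = "backprop", learningrate = 0.01,
                     stepmax = 1e6)

# Scale new inputs the same way, then undo the target scaling.
valnet <- compute(backnet, (1:10)^2 / 100)
print(valnet$net.result * 10)
```

The simpler route, as in the answer above, is to leave linear.output at its default of TRUE, which gives the output neuron a linear activation and removes the need to scale the targets.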
Upvotes: 2