Reputation: 271
I need a reliable substitute for the sigmoid activation function. The problem with the sigmoid function is that its output is between 0 and 1, and I need an activation function with output between 0 and 255. I'm training a NN with the backpropagation algorithm. If I use some other function, do I need to tweak the learning algorithm?
Upvotes: 0
Views: 522
Reputation: 13356
You can easily achieve that by multiplying the output by 255, which maps it from the [0, 1] range to the [0, 255] range.
If you change the activation function itself, you'll also need to change the calculations. The backpropagation algorithm uses gradient descent, so you'll need to plug the derivative of the new activation function into the weight updates accordingly.
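For illustration, here is a minimal sketch (the function names are my own, not from any library) of a sigmoid rescaled to a 0 to 255 output range, together with the derivative that backpropagation would need if you used it as the activation:

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid, output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def scaled_sigmoid(x, scale=255.0):
    """Sigmoid rescaled so the output lies in (0, scale)."""
    return scale * sigmoid(x)

def scaled_sigmoid_derivative(x, scale=255.0):
    """d/dx [scale * sigmoid(x)] = scale * sigmoid(x) * (1 - sigmoid(x))."""
    s = sigmoid(x)
    return scale * s * (1.0 - s)
```

Note that the factor of 255 scales the gradients as well, so you would likely have to reduce the learning rate to keep training stable.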
Upvotes: 1
Reputation: 3088
The simplest solution to your problem is to scale your data. Divide the target outputs of your training set by 255 during training, and when you use the trained model, multiply the output of your neural network by 255. This way you don't have to change the gradient calculation at all.
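A small sketch of that idea, with made-up pixel-style numbers purely for illustration:

```python
import numpy as np

# Hypothetical training targets on the original 0-255 scale (e.g. pixel values).
targets = np.array([0.0, 64.0, 128.0, 255.0])

# Divide by 255 before training so the targets fit the sigmoid's (0, 1) range;
# the network and its gradient calculation stay completely unchanged.
scaled_targets = targets / 255.0

# At prediction time the trained network still emits values in (0, 1);
# multiply by 255 to map them back onto the original scale.
network_output = np.array([0.02, 0.26, 0.49, 0.98])  # example sigmoid outputs
predictions = network_output * 255.0
print(predictions)  # values back on the 0-255 scale
```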
Upvotes: 1