Reputation: 105067
How can I set up a neural network so that it accepts and outputs a continuous range of values instead of discrete ones? From what I recall from taking a neural network class a couple of years ago, the activation function would be a sigmoid, which yields a value between 0 and 1. If I want my neural network to yield a real-valued scalar, what should I do? I thought that if I wanted a value between 0 and 10 I could just multiply the value by 10, but what about negative values? Is this what people usually do, or is there another way? And what about the input?
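To make the scaling idea in the question concrete, here is a minimal sketch (the generalization to an arbitrary range, including negative values, is just an affine shift-and-scale of the sigmoid output; the function names and sample numbers are illustrative only):

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def scale_to_range(s, lo, hi):
        """Map a sigmoid activation s in (0, 1) onto the interval (lo, hi)."""
        return lo + (hi - lo) * s

    # The 0..10 case from the question: effectively "multiply by 10".
    print(scale_to_range(sigmoid(0.3), 0.0, 10.0))
    # Negative ranges work the same way, e.g. -5..5.
    print(scale_to_range(sigmoid(0.3), -5.0, 5.0))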
Thanks
Upvotes: 35
Views: 25286
Reputation: 35914
Much of the work in the field of neuroevolution involves using neural networks with continuous inputs and outputs.
There are several common approaches, for example scaling (and shifting) a bounded sigmoid output onto the desired range, or using an unbounded linear activation function on the output nodes.
(Two illustrative figures omitted; source: natekohl.net)
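As an illustration of the linear-output approach (my own sketch, not code from this answer), a small feed-forward network with sigmoid hidden units and a linear output unit accepts continuous inputs and produces an unbounded continuous output; the layer sizes and random weights below are arbitrary:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sizes: 3 continuous inputs, 5 hidden units, 1 continuous output.
    W1 = rng.normal(scale=0.5, size=(5, 3))
    b1 = np.zeros(5)
    W2 = rng.normal(scale=0.5, size=(1, 5))
    b2 = np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(x):
        """Sigmoid hidden layer, linear (identity) output layer -> unbounded real output."""
        h = sigmoid(W1 @ x + b1)
        return W2 @ h + b2

    print(forward(np.array([0.2, -1.3, 4.0])))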
Upvotes: 33
Reputation: 75125
There are no rules which require the output ( * ) to be any particular function. In fact, we typically need to add some arithmetic operations after the activation function implemented in a given node, in order to scale and otherwise coerce the output to a particular form.
The advantage of working with all-or-nothing outputs and/or outputs normalized to the 0.0 to 1.0 range is that it makes things more easily tractable, and also avoids issues such as overflow.
( * ) "Output" can be understood here as either the ouptut a given node (neuron) within the network or that of the network as a whole.
As indicated by Mark Bessey, the input [to the network as a whole] and the output [of the network] typically receive some filtering/conversion. As hinted in this response and in Mark's comment, it may be preferable to have normalized/standard nodes in the "hidden" layers of the network, and to apply some normalization/conversion/discretization as required for the input and/or the output of the network. Such a practice, however, is a matter of practicality rather than an imperative requirement of neural networks in general.
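A sketch of that arrangement, under input/output ranges I have chosen purely for illustration (nothing here is prescribed by the answer): the network proper works only with 0-to-1 values, and thin conversion steps at the boundary map real-world units in and out.

    def to_unit(value, lo, hi):
        """Normalize a real-world value from [lo, hi] into the 0..1 range the nodes work in."""
        return (value - lo) / (hi - lo)

    def from_unit(activation, lo, hi):
        """Map a 0..1 network output back to real-world units (lo may be negative)."""
        return lo + (hi - lo) * activation

    # Hypothetical ranges: inputs are temperatures in -20..40, outputs are prices in 0..500.
    x_net = to_unit(25.0, -20.0, 40.0)   # value fed into the normalized network
    y_net = 0.73                         # pretend 0..1 output of the network
    print(x_net, from_unit(y_net, 0.0, 500.0))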
Upvotes: 7
Reputation: 19782
You will typically need to do some filtering (level conversion, etc.) on both the input and the output. Obviously, filtering the input will change the internal state, so some consideration needs to be given to not losing the signal you're trying to train on.
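One common form of that filtering, sketched under my own assumptions rather than anything specified in this answer, is to standardize each input feature with statistics computed on the training data and to reuse exactly those statistics at prediction time, so the signal the network was trained on is not distorted:

    import numpy as np

    # Hypothetical training inputs: two features on very different scales.
    train_X = np.array([[1.0, 200.0],
                        [2.0, 180.0],
                        [3.0, 220.0]])

    # Statistics come from the training set only...
    mean = train_X.mean(axis=0)
    std = train_X.std(axis=0)

    def standardize(x):
        """Level-convert raw inputs to zero mean / unit variance."""
        return (x - mean) / std

    # ...and the same transform is reused on new inputs, so the network keeps
    # seeing values on the scale it was trained on.
    print(standardize(np.array([2.5, 190.0])))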
Upvotes: 3