Reputation: 1
I have a backpropagation neural network that I have written in Q on top of a Kdb+ database.
I pre-process the data by normalizing it into roughly the [0,1] range (the formula below actually maps to [0.1, 0.9]). The network is trained to predict future moving averages on a large data set split 60:20:20 into training, validation and test sets.
Normalization formula:
processed data: (0.8*(VALn - MINn)/(MAXn - MINn))+0.1
VALn = unprocessed data value
MAXn = max of data set
MINn = min of data set
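For reference, a minimal sketch of this formula as a q function (the name norm and the sample vector are purely illustrative, not part of my actual code):

/ scale a value or vector into [0.1,0.9] given a min and max
/ q evaluates right-to-left: mx-mn, then %, then *0.8, then +0.1
norm:{[v;mn;mx] 0.1+0.8*(v-mn)%mx-mn}
train:50 55 60 65 70f                      / raw training values (made up)
ntrain:norm[;min train;max train] train    / -> 0.1 0.3 0.5 0.7 0.9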
How do I go about normalizing new data for the final trained network?
Would I run new inputs through the above formula, keeping the MINn and MAXn values from the training set?
Thanks
Upvotes: 0
Views: 907
Reputation: 626
You should keep the same MAXn and MINn from the training set, since changing them at test time would change how raw data is mapped to processed data. For a quick sanity check, redo the preprocessing with a different MAXn and MINn and then try to predict the training cases: performance will drop, because the normalized data no longer look the way they did during training.
Note that if the test set contains values higher than MAXn or lower than MINn, those values will fall outside the intended [0.1, 0.9] range after normalization. This is generally fine as long as there are not too many such cases; it simply means the network is seeing inputs slightly outside the previously seen range.
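A minimal q sketch of the reuse, assuming norm is the formula from the question and using made-up values:

norm:{[v;mn;mx] 0.1+0.8*(v-mn)%mx-mn}   / formula from the question
train:50 55 60 65 70f
mn:min train; mx:max train              / min/max fixed from the TRAINING data only
new:52 72f                              / unseen inputs; 72 exceeds the training max
norm[;mn;mx] new                        / -> 0.18 0.98; 72 lands just outside [0.1,0.9]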
Upvotes: 1
Reputation: 10850
Yes, for prediction you should use the same normalization formula (with the same MINn and MAXn) that you used to normalize the training data.
Upvotes: 0