Reputation: 659
After I've used any type of normalization in training, how would that affect my predictions in the future?
Let's say the close price of a certain stock was normalized between 0 and 1, I trained the model accordingly, and I reached a satisfying accuracy. Now when I make a prediction on new data, do I need to normalize the new data as well? If so, what do I fit it to? At this point I no longer have a train or test dataset, and the new data might be higher than the fitted data (which would put it above 1) or lower than the lowest data (which would put it below 0).
I know normalization helps a lot during training, but what about after training, when I want to predict on real data?
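To make the scenario concrete, here is roughly what I mean (a sketch in plain numpy; the prices are made up):

```python
import numpy as np

# "Fit": record the min and max of the TRAINING prices only.
train_prices = np.array([10.0, 12.5, 15.0, 20.0])
lo, hi = train_prices.min(), train_prices.max()

# Training data ends up neatly inside [0, 1].
scaled_train = (train_prices - lo) / (hi - lo)

# At prediction time, a new price above the training maximum
# maps above 1 when the same min/max are reused.
new_price = 25.0
scaled_new = (new_price - lo) / (hi - lo)  # 1.5
```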
Upvotes: 1
Views: 1476
Reputation: 4347
I think you're referring to rescaling the data. For this, you can add a normalization layer to your model so it automatically normalizes the input data, and you don't need to do it as a separate step. For example, with TF you can do:
import numpy as np
import tensorflow as tf
from tensorflow.keras import Model
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.layers.experimental.preprocessing import Rescaling

my_shape = tuple()   # e.g. (height, width, channels) for image data
my_data: np.ndarray  # some image array

def model(my_shape):
    input_data = Input(shape=my_shape)
    x = Rescaling(scale=1./255)(input_data)
    x = Dense(256, activation='relu')(x)
    # sigmoid, not softmax: softmax over a single unit always outputs 1
    x = Dense(1, activation='sigmoid')(x)
    return Model(input_data, x)
Here we rescale with a factor of 1./255, which suits 8-bit image data. Now, if you train this model (well, not this one, as it won't do much) and then want to predict on new images, you simply load the image as an array and the model will take care of rescaling for you.
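To make the prediction step concrete, here is a minimal end-to-end sketch (the shape and data are made up; in recent TF versions Rescaling is also available directly under tensorflow.keras.layers):

```python
import numpy as np
from tensorflow.keras import Model
from tensorflow.keras.layers import Dense, Input, Rescaling

def build_model(input_shape):
    inputs = Input(shape=input_shape)
    x = Rescaling(scale=1.0 / 255)(inputs)  # raw 0-255 values become 0-1 here
    x = Dense(256, activation='relu')(x)
    outputs = Dense(1, activation='sigmoid')(x)
    return Model(inputs, outputs)

model = build_model((64,))
# Feed raw 0-255 values; no manual scaling step is needed.
raw_batch = np.random.randint(0, 256, size=(2, 64)).astype('float32')
preds = model.predict(raw_batch)
```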
If you also want to normalize to a specific mean and variance, you can add a Normalization layer to the preprocessing step.
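A minimal sketch of that (assuming a TF version where Normalization is available directly under tensorflow.keras.layers; the numbers are made up):

```python
import numpy as np
from tensorflow.keras.layers import Normalization

# adapt() computes the mean and variance of the training data;
# the layer then reuses those same statistics on any future input.
norm = Normalization(axis=-1)
train_data = np.array([[10.0], [12.5], [15.0], [20.0]], dtype='float32')
norm.adapt(train_data)

# A new value above the training range is standardized with the
# training statistics, so it simply lands well above zero.
out = norm(np.array([[25.0]], dtype='float32'))
```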
Upvotes: 0