J.Weiser

Reputation: 21

How to normalize a multiple input neural network?

I have a question about how to normalize, and especially how to denormalize, neural networks with multiple inputs and only one output.

Do I need to normalize the input variables independently of each other, and then just use the scale of the variable I also want as output to rescale my data?

For example: I have the input variables a and b.

a has a scale of 100-1000

b has a scale of 1-10

After normalization both variables are on a scale of 0-1.

My output data now needs to be the prediction for tomorrow's a (a at t+1) and therefore again has a scale of 100-1000. Will I therefore simply denormalize according to the way I normalized a (i.e., invert a's normalization)? Or do I need to consider something else?

For normalizing both variables my code looks as follows:

import numpy as np
from pandas import Series
from sklearn.preprocessing import MinMaxScaler

# df is an existing DataFrame with the raw columns "a" and "b"
series1 = Series(df["a"])
series2 = Series(df["b"])

# Reshape into column vectors; the scalers expect 2-D input
values1 = series1.values.reshape((len(series1), 1))
values2 = series2.values.reshape((len(series2), 1))

# One scaler per feature, so each can be inverted independently later
scaler1 = MinMaxScaler(feature_range=(0, 1)).fit(values1)
scaler2 = MinMaxScaler(feature_range=(0, 1)).fit(values2)

df["Normalized_a"] = scaler1.transform(values1)
df["Normalized_b"] = scaler2.transform(values2)

### Combine the two normalized columns into one NumPy array
normalizeddata = df[["Normalized_a", "Normalized_b"]].values
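(As a side note, a single MinMaxScaler fitted on a two-column array already scales each column by its own min/max, so the two separate scalers are not strictly required for the forward pass. A minimal sketch with made-up numbers on the two scales from the question:)

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Made-up data: column 0 on a 100-1000 scale, column 1 on a 1-10 scale
data = np.array([[100.0, 1.0],
                 [550.0, 5.5],
                 [1000.0, 10.0]])

scaler = MinMaxScaler(feature_range=(0, 1))
normalized = scaler.fit_transform(data)  # each column scaled independently
print(normalized)
# [[0.  0. ]
#  [0.5 0.5]
#  [1.  1. ]]

# Round-trip back to the original per-column scales
restored = scaler.inverse_transform(normalized)
```

Per-feature scalers still make the inverse step at prediction time simpler, since the model outputs only one column.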

Then I split the data:

### Split the data
X_train = []
y_train = []
for i in range(3, len(normalizeddata) - 3):
    y_train.append(normalizeddata[i, 0])
    X_train.append(normalizeddata[i + 1:i + 4][::-1])

X_train = np.array(X_train).reshape(-1, 3, 2)
y_train = np.array(y_train)

X_test = []
y_test = []
for i in range(0, 3):
    y_test.append(normalizeddata[i, 0])
    X_test.append(normalizeddata[i + 1:i + 4][::-1])

X_test = np.array(X_test).reshape(-1, 3, 2)
y_test = np.array(y_test)

The model itself looks as follows taking two variables into consideration (see input shape of NumPy array):

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

model = Sequential()
model.add(LSTM(100, activation="relu", input_shape=(3, 2), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, activation="relu", return_sequences=False))
model.add(Dropout(0.2))
# The single output comes from a Dense layer; stacking another LSTM here
# would fail, because the previous layer no longer returns sequences
model.add(Dense(1, activation="relu"))
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, batch_size=2, epochs=10)

And last but not least I denormalized the output using Scaler1:

### Predicting y_test data
y_pred = model.predict(X_test)   # still on the 0-1 scale
df_pred = df[:3].copy()          # .copy() avoids a SettingWithCopyWarning
df_pred["a_predicted"] = scaler1.inverse_transform(y_pred.reshape(-1, 1))

Thanks a lot!

Upvotes: 0

Views: 2331

Answers (2)

Sreeram TP

Reputation: 11927

It will be better to use two scalers, say scaler a and scaler b.

Then scale feature a with scaler a and feature b with scaler b, and prepare the dataset using lagged features. If feature a is the one you are forecasting, make the prediction and inverse-scale it with scaler a.
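A minimal sketch of that workflow, with made-up arrays standing in for a and b and a pretend model output:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Made-up series standing in for features a (100-1000) and b (1-10)
a = np.array([100.0, 400.0, 700.0, 1000.0]).reshape(-1, 1)
b = np.array([1.0, 4.0, 7.0, 10.0]).reshape(-1, 1)

# One scaler per feature
scaler_a = MinMaxScaler().fit(a)
scaler_b = MinMaxScaler().fit(b)

a_norm = scaler_a.transform(a)
b_norm = scaler_b.transform(b)

# ... build lagged features from a_norm/b_norm and train the model ...

y_pred_scaled = np.array([[0.5]])  # pretend model output on the 0-1 scale

# The target is tomorrow's a, so invert with the scaler fitted on a
y_pred = scaler_a.inverse_transform(y_pred_scaled)
print(y_pred)  # roughly [[550.]]
```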

Upvotes: 0

Igor F.

Reputation: 2699

That depends on the activation function in the output layer and on the target output you use for training. Since you seem to want the output to be of the same kind as one of the inputs, it seems natural to me to normalize the target output the same way you normalize a and, when you use the network for recall, apply the inverse of a's normalization.
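Concretely, min-max normalization is just a linear map, so the inverse is the same formula run backwards. A sketch, assuming a's range is 100-1000 as in your question:

```python
# Min-max normalization and its inverse for a single feature,
# assuming a's range is 100-1000 (taken from the question)
a_min, a_max = 100.0, 1000.0

def normalize(x):
    return (x - a_min) / (a_max - a_min)

def denormalize(x_norm):
    return x_norm * (a_max - a_min) + a_min

print(normalize(550.0))    # 0.5
print(denormalize(0.5))    # 550.0
```

This is exactly what `MinMaxScaler.inverse_transform` does for you when the scaler was fitted on a.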

However, consider editing your question to include some data and sample code. See How to create a Minimal, Complete, and Verifiable example.

Upvotes: 0
