Vinod Kumar

Reputation: 1622

Keras Normalization for a 2d input array

I am new to machine learning and trying to apply it to my problem. I have a training dataset of 44000 rows, each with a 2d feature array of shape (6, 25). I want to build a sequential model, and I was wondering if there is a way to use the features without flattening them. Currently, I flatten the features to a 1d array and normalize them for training (see the code below). I could not find a way to normalize 2d features.

from tensorflow import keras
from tensorflow.keras import layers, Input, Model
from tensorflow.keras.layers.experimental import preprocessing
from sklearn.model_selection import train_test_split

# dataset2d (44000 x 6 x 25 features) and flux_val (targets) are loaded earlier

# flatten each 6 x 25 feature block into a 1d vector before normalizing
dataset2d = dataset2d.reshape(dataset2d.shape[0],
                              dataset2d.shape[1]*dataset2d.shape[2])
normalizer = preprocessing.Normalization()
normalizer.adapt(dataset2d)
print(normalizer.mean.numpy())

x_train, x_test, y_train, y_test = train_test_split(dataset2d, flux_val,
                                                    test_size=0.2)

# %% DNN regression multiple parameter
def build_and_compile_model(norm):
    inputs = Input(shape=(x_test.shape[1],))
    x = norm(inputs)
    x = layers.Dense(128, activation="selu")(x)
    x = layers.Dense(64, activation="relu")(x)
    x = layers.Dense(32, activation="relu")(x)
    x = layers.Dense(1, activation="linear")(x)
    model = Model(inputs, x)
    model.compile(loss='mean_squared_error',
                  optimizer=keras.optimizers.Adam(learning_rate=1e-3))
    return model


dnn_model = build_and_compile_model(normalizer)
dnn_model.summary()
# save the best weights and interrupt training when the model is no longer improving
path_checkpoint = "model_checkpoint.h5"
modelckpt_callback = keras.callbacks.ModelCheckpoint(monitor="val_loss",
                                                     filepath=path_checkpoint,
                                                     verbose=1,
                                                     save_weights_only=True,
                                                     save_best_only=True)
es_callback = keras.callbacks.EarlyStopping(monitor="val_loss",
                                            min_delta=0, patience=10)
history = dnn_model.fit(x_train, y_train, validation_split=0.2,
                        epochs=120, callbacks=[es_callback, modelckpt_callback])

I also tried to modify my model input layer to the following, so that I do not need to reshape my input:

inputs = Input(shape=(x_test.shape[-1], x_test.shape[-2], ))

and to modify the normalization to the following:

normalizer = preprocessing.Normalization(axis=1)
normalizer.adapt(dataset2d)
print(normalizer.mean.numpy())

But this does not seem to help. The normalization adapts to a 1d array of length 6, while I want it to adapt to a 2d array of shape 25, 6.
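
To make the shapes concrete, here is a small sketch of what I observe versus what I would like (the random array is just a stand-in for my real data):

import numpy as np
from tensorflow.keras.layers.experimental import preprocessing

# stand-in for my 44000 x 6 x 25 training array
dataset2d = np.random.rand(44000, 6, 25).astype("float32")

normalizer = preprocessing.Normalization(axis=1)
normalizer.adapt(dataset2d)
print(normalizer.mean.numpy().shape)  # only 6 statistics, one per row of the feature grid

# what I would like instead is a separate mean/variance for every element
# of the 2d feature grid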

Sorry for the long question. Your help will be much appreciated.

Upvotes: 0

Views: 990

Answers (1)

CrazyBrazilian

Reputation: 1070

I'm not sure I understood your issue. A normalization layer can take an N-D tensor and it produces an output of the same shape, for example:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
import numpy as np

# a small 2 x 3 x 4 example tensor
t = tf.constant(np.arange(2*3*4).reshape(2, 3, 4), dtype=tf.float32)

tf.print("\n", t)

# normalize each sample along axis 1; the output keeps the (2, 3, 4) shape of the input
normalizer_layer = tf.keras.layers.LayerNormalization(axis=1)

output = normalizer_layer(t)

tf.print("\n", output)

Upvotes: 0
