I need to know how to initialize my network's weights, because it doesn't predict well and takes a long time to train.
This is my code:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# kf is a KFold instance; X and Y are the feature and target arrays
for train_index, test_index in kf.split(X):
    X_train, Y_train = X[train_index], Y[train_index]
    X_test, Y_test = X[test_index], Y[test_index]
    model = Sequential()
    model.add(Dense(units=4, activation='sigmoid', input_dim=4))
    model.add(Dense(units=16, activation='linear'))
    model.add(Dense(units=1, activation='linear'))
    model.compile(loss='mse', optimizer='adamax')
    model.fit(X_train, Y_train, batch_size=4, epochs=1200,
              validation_data=(X_test, Y_test), verbose=1)
You can use one of the Keras initializers. For example, the following code uses the RandomNormal initializer:
import tensorflow as tf

initializer = tf.keras.initializers.RandomNormal(mean=0., stddev=1.)
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
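As an aside (not part of the original answer), Keras also accepts initializers by their string identifiers when the default parameters are enough; a minimal sketch:

import tensorflow as tf

# Pass the initializer by its string identifier (default parameters) ...
layer_a = tf.keras.layers.Dense(16, activation='relu', kernel_initializer='he_normal')
# ... or pass an initializer object when you need custom parameters
layer_b = tf.keras.layers.Dense(16, activation='relu',
                                kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42))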
If you want to initialize every layer with the RandomNormal initializer, your code should look like this:
for train_index, test_index in kf.split(X):
    initializer = tf.keras.initializers.RandomNormal(mean=0., stddev=1.)
    X_train, Y_train = X[train_index], Y[train_index]
    X_test, Y_test = X[test_index], Y[test_index]
    model = Sequential()
    model.add(Dense(units=4, activation='sigmoid', input_dim=4, kernel_initializer=initializer))
    model.add(Dense(units=16, activation='linear', kernel_initializer=initializer))
    model.add(Dense(units=1, activation='linear', kernel_initializer=initializer))
    model.compile(loss='mse', optimizer='adamax')
    model.fit(X_train, Y_train, batch_size=4, epochs=1200,
              validation_data=(X_test, Y_test), verbose=1)
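If you also want the starting weights to be reproducible across runs, you can pass a seed to the initializer. A minimal sketch, not from the original answer (the seed value 42 is arbitrary, and exact behaviour may vary between TensorFlow versions):

import tensorflow as tf

# A seeded initializer makes the drawn starting weights deterministic across runs
initializer = tf.keras.initializers.RandomNormal(mean=0., stddev=1., seed=42)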