Mohammad Amin

Reputation: 454

How to initialize biases in a Keras model?

I am trying to build a synthetic model in Keras, and I need to assign values for the weights and biases. Assigning the weights is easy; I am using the instructions provided here: https://keras.io/initializations/. However, I could not find any instructions on how to assign the biases. Any ideas?

Upvotes: 22

Views: 42461

Answers (4)

Trevor Witter

Reputation: 126

Weight and bias initialization for each layer can be set via the kernel_initializer and bias_initializer keyword arguments of layers.Dense(). If not specified by the user, the defaults kernel_initializer='glorot_uniform' and bias_initializer='zeros' are applied.

For example, if you wanted to initialize a layer's weights with a random uniform distribution instead of glorot_uniform, and its biases to 0.1 instead of 0, you could define the layer as follows:

from keras import layers, initializers

layer = layers.Dense(64,
                     activation='relu',
                     kernel_initializer='random_uniform',
                     bias_initializer=initializers.Constant(0.1))(previous_layer)

See https://keras.io/layers/core/ for details on the Dense layer's keyword arguments and https://keras.io/initializers/ for the preset and customizable initializer options.
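If you want to double-check the result, here is a minimal sketch (the input shape of 32 is arbitrary) that builds the layer on an Input tensor and reads the bias values back with get_weights():

from keras import layers, initializers

inputs = layers.Input(shape=(32,))
dense = layers.Dense(64,
                     activation='relu',
                     kernel_initializer='random_uniform',
                     bias_initializer=initializers.Constant(0.1))
dense(inputs)  # calling the layer builds it, so its weights now exist

kernel, bias = dense.get_weights()
print(bias[:5])  # [0.1 0.1 0.1 0.1 0.1]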

Upvotes: 6

StatsSorceress

Reputation: 3099

You can also use bias_initializer like this:

model.add(Dense(64,
                kernel_initializer='random_uniform',
                bias_initializer='zeros'))

This is from https://keras.io/initializers/
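For context, a complete (if minimal) Sequential model using those arguments might look like this; the layer sizes and input_dim are purely illustrative:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(64,
                input_dim=20,
                kernel_initializer='random_uniform',
                bias_initializer='zeros'))
model.add(Dense(1, activation='sigmoid'))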

Upvotes: 21

Hengda Qi

Reputation: 334

You can find the answer here: https://keras.io/layers/core/

weights: list of Numpy arrays to set as initial weights. The list should have 2 elements, of shape (input_dim, output_dim) and (output_dim,) for weights and biases respectively.

When adding a new layer, you can pass the "weights" argument: a list containing the initial W and b arrays with the shapes specified above.

model.add(Dense(50, input_dim=X_train.shape[1], weights=[np.zeros([692, 50]), np.zeros(50)]))
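A self-contained version of the same pattern (a sketch; the 692 input features and 50 units are illustrative, and input_dim must match the first dimension of the weight matrix):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

init_w = np.zeros([692, 50])  # shape (input_dim, output_dim)
init_b = np.zeros(50)         # shape (output_dim,)

model = Sequential()
model.add(Dense(50, input_dim=692, weights=[init_w, init_b]))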

Upvotes: 19

xiaoming-qxm

Reputation: 1828

Initialize biases with a small positive value such as 0.1.

If you are using ReLU neurons, it is also good practice to initialize the biases with a slightly positive value to avoid "dead neurons". An example of this is shown below.
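One way to express that in Keras (a sketch; the layer size is arbitrary):

from keras import layers, initializers

relu_layer = layers.Dense(128,
                          activation='relu',
                          bias_initializer=initializers.Constant(0.1))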

Upvotes: 6
