Kento Nishi

Reputation: 598

TensorFlow Dense Layers: 1-dimensional weights?

I have my network set up in the following fashion:

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(10, activation='softmax')
])

I would expect this configuration to be like this:

[784 neurons]
(784,128 weights)
[128 neurons]
(128,10 weights)
[10 neurons]

But, when I print the network's weights with model.get_weights(), it produces the following output:

for w in model.get_weights():
    print(w.shape,"\n")

(784, 128)

(128,)

(128, 10)

(10,)

Why do (128,) and (10,) exist in this model?

Upvotes: 1

Views: 72

Answers (1)

giser_yugang

Reputation: 6166

(784, 128) and (128, 10) are the kernels (weight matrices) of the two Dense layers; (128,) and (10,) are their biases. If you don't need biases, you can disable them with the use_bias parameter. For example:

import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, use_bias=False, activation='relu'),
    keras.layers.Dense(10, use_bias=False, activation='softmax')
])

for w in model.get_weights():
    print(w.shape,"\n")

# print
(784, 128) 

(128, 10) 
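For context, here is a minimal NumPy sketch (not from the original answer) of what a Dense layer computes, which shows where the 1-dimensional arrays come from: each layer evaluates `y = x @ W + b`, so the kernel has shape (inputs, units) and the bias has shape (units,).

```python
import numpy as np

# A Dense layer computes y = activation(x @ W + b).
# Kernel W: shape (inputs, units); bias b: shape (units,).
# That is why model.get_weights() also returns (128,) and (10,) arrays.
x = np.random.rand(1, 784)       # one flattened 28x28 image
W1 = np.random.rand(784, 128)    # kernel of the first Dense layer
b1 = np.random.rand(128)         # bias of the first Dense layer

h = np.maximum(x @ W1 + b1, 0)   # relu(x @ W1 + b1)
print(h.shape)                   # (1, 128)
```

The bias is one value per output unit, not per connection, which is why it is 1-dimensional.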

Upvotes: 1
