Daniel

Reputation: 1448

How to get the output of a hidden layer given an input and the hidden layer's weights and biases in Keras?

Suppose I have trained the model below for an epoch:

model = Sequential([
    Dense(32, input_dim=784), # first number is output_dim
    Activation('relu'),
    Dense(10), # output_dim, input_dim is taken for granted from above
    Activation('softmax'),
])

I have the weights dense1_w and the biases dense1_b of the first hidden layer (call it dense1), and a single data sample sample.

How do I use these to get the output of dense1 on sample in Keras?

Thanks!

Upvotes: 10

Views: 16970

Answers (3)

sajed zarrinpour

Reputation: 1224

As for the weights, I had a non-Sequential model. What I did was call model.summary() to find the desired layer's name and then model.get_layer("layer_name").get_weights() to get its weights.
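
For example (a minimal sketch; "dense1" is just a placeholder name, use whatever name model.summary() prints for your layer):

model.summary()  # lists the layer names
dense1_w, dense1_b = model.get_layer("dense1").get_weights()  # kernel and bias of that layer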

Upvotes: 0

Thomas Pinetz

Reputation: 7148

The easiest way is to use the Keras backend. With the backend you can define a function that returns the intermediate output of a Keras model, as described in the Keras FAQ (https://keras.io/getting-started/faq/#how-can-i-obtain-the-output-of-an-intermediate-layer).

So in essence:

from keras import backend as K

# backend function mapping the model input to the output of the first hidden layer (after ReLU)
get_1st_layer_output = K.function([model.layers[0].input],
                                  [model.layers[1].output])
layer_output = get_1st_layer_output([X])[0]  # X: input array of shape (n_samples, 784)
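
The same FAQ also shows a second option: build a Model that shares the trained layers and exposes the intermediate tensor as its output. A minimal sketch (assuming sample is the single 784-dimensional example from the question):

from keras.models import Model
import numpy as np

# a second model that reuses the trained layers but outputs the hidden layer
intermediate_model = Model(inputs=model.input,
                           outputs=model.layers[1].output)
dense1_output = intermediate_model.predict(np.asarray(sample).reshape(1, 784))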

Upvotes: 19

Wilmar van Ommeren

Reputation: 7689

Just recreate the first part of the model up to the layer whose output you want (in your case only the first dense layer). Afterwards you can load the trained weights of the first part into your newly created model and compile it.

The output of the prediction with this new model will be the output of the layer (in your case the first dense layer).

from keras.models import Sequential
from keras.layers import Dense, Activation
import numpy as np

model = Sequential([
    Dense(32, input_dim=784), # first number is output_dim
    Activation('relu'),
    Dense(10), # output_dim, input_dim is taken for granted from above
    Activation('softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')

#create some random data (one-hot labels for the 10 classes)
n_samples = 5
samples = np.random.randint(0, 10, 784*n_samples).reshape(-1, 784)
labels = np.eye(10)[np.random.randint(0, 10, n_samples)]

#train your sample model
model.fit(samples, labels)

#create new model
new_model= Sequential([
    Dense(32, input_dim=784), # first number is output_dim
    Activation('relu')])

#set weights of the first layer
new_model.set_weights(model.layers[0].get_weights())

#compile it after setting the weights
new_model.compile(optimizer='adam', loss='categorical_crossentropy')

#get output of the first dense layer
output = new_model.predict(samples)
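
As a sanity check (a sketch using the names from the question; dense1_w and dense1_b are the kernel and bias of the trained first layer), the same output can be computed by hand, because a Dense layer followed by ReLU is just relu(x.dot(W) + b):

# manual computation of the first hidden layer's output for one sample
dense1_w, dense1_b = model.layers[0].get_weights()
sample = samples[0]
manual_output = np.maximum(sample.dot(dense1_w) + dense1_b, 0)  # ReLU(x.W + b)
print(np.allclose(manual_output, new_model.predict(sample.reshape(1, 784))[0], atol=1e-4))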

Upvotes: 10
