Aleksk89

Reputation: 93

Keras forward pass with dropout

I am trying to use dropout to get error estimates for a neural network.

This involves running several forward passes of my network after training, with dropout activated. However, dropout does not seem to be active when calling model.predict(). Can this be done in Keras, or must I take my weights elsewhere?

Upvotes: 2

Views: 1466

Answers (2)

Vadim Smolyakov

Reputation: 1197

Stochastic forward passes (i.e. using dropout at test time) can be implemented with a Keras backend function. Assuming you have a trained neural network called model:

import numpy as np
from keras import backend as K

nb_MC_samples = 100
# Build a function that takes the learning phase as an extra input,
# so dropout can be forced on even after training.
MC_output = K.function([model.layers[0].input, K.learning_phase()],
                       [model.layers[-1].output])

learning_phase = True  # use dropout at test time
MC_samples = [MC_output([x_test, learning_phase])[0] for _ in range(nb_MC_samples)]
MC_samples = np.array(MC_samples)  # shape: (nb_MC_samples, batch_size, nb_classes)

For a complete implementation, see the following IPython notebook.
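As a side note, in modern tf.keras (TensorFlow 2.x) the `K.learning_phase()` mechanism is no longer needed: calling the model directly with `training=True` keeps dropout active at inference time. A minimal sketch (the toy model, input shapes, and sample counts below are illustrative assumptions, not from the original answer):

```python
import numpy as np
import tensorflow as tf

# Toy model with a dropout layer (stands in for a trained network).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(3, activation="softmax"),
])

x_test = np.random.rand(4, 8).astype("float32")

nb_MC_samples = 100
# training=True forces dropout on, so each forward pass is stochastic.
MC_samples = np.stack([model(x_test, training=True).numpy()
                       for _ in range(nb_MC_samples)])  # (samples, batch, classes)

mean_pred = MC_samples.mean(axis=0)  # predictive mean per example
std_pred = MC_samples.std(axis=0)    # spread across passes = uncertainty estimate
```

The standard deviation across the Monte Carlo passes is nonzero precisely because dropout is sampling different sub-networks on each call; with `model.predict()` (or `training=False`) every pass would be identical.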

Upvotes: 1

Lukasz Tracewski

Reputation: 11377

It is already done in Keras, see e.g. this discussion on the project's page. More on how it works can be found for instance in CS231n: Convolutional Neural Networks for Visual Recognition - AFAIK a very similar implementation is in Keras. Specifically:

(...) Crucially, note that in the predict function we are not dropping anymore, but we are performing a scaling of both hidden layer outputs by p. This is important because at test time all neurons see all their inputs, so we want the outputs of neurons at test time to be identical to their expected outputs at training time.
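The quoted scaling argument can be checked numerically. A small sketch (the activation values and keep-probability are made-up toy numbers): dropping units with keep-probability p at train time, versus keeping everything and scaling by p at test time, gives the same expected output. (Note that Keras itself uses the "inverted dropout" variant, scaling by 1/p during training instead, which has the same expectation.)

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                        # probability of keeping a unit
h = np.ones(1_000_000)         # toy hidden-layer activations, all 1.0

# Train time: apply a random binary keep-mask, no scaling.
mask = rng.random(h.shape) < p
train_mean = (h * mask).mean()  # approximately p * 1.0

# Test time: keep every unit, scale the outputs by p.
test_mean = (h * p).mean()      # exactly p * 1.0

# The two means agree (up to sampling noise), as the quote claims.
```

This is why dropout appears "inactive" in model.predict(): the stochastic masking has been folded into a deterministic rescaling, so repeated predictions are identical by design.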

Upvotes: 0
