jwin

Reputation: 21

get_all_param_values(): how to read a single Lasagne layer's parameters

I am using Lasagne and Theano to build a convolutional neural network. It currently consists of:

l_shape = lasagne.layers.ReshapeLayer(l_in, (-1, 3,130, 130))
l_conv1 = lasagne.layers.Conv2DLayer(l_shape, num_filters=32, filter_size=3, pad=1)
l_conv1_1 = lasagne.layers.Conv2DLayer(l_conv1, num_filters=32, filter_size=3, pad=1)
l_pool1 = lasagne.layers.MaxPool2DLayer(l_conv1_1, 2)
l_conv2 = lasagne.layers.Conv2DLayer(l_pool1, num_filters=64, filter_size=3, pad=1)
l_conv2_2 = lasagne.layers.Conv2DLayer(l_conv2, num_filters=64, filter_size=3, pad=1)
l_pool2 = lasagne.layers.MaxPool2DLayer(l_conv2_2, 2)
l_conv3 = lasagne.layers.Conv2DLayer(l_pool2, num_filters=64, filter_size=3, pad=1)
l_conv3_2 = lasagne.layers.Conv2DLayer(l_conv3, num_filters=64, filter_size=3, pad=1)
l_pool3 = lasagne.layers.MaxPool2DLayer(l_conv3_2, 2)
l_conv4 = lasagne.layers.Conv2DLayer(l_pool3, num_filters=64, filter_size=3, pad=1)
l_conv4_2 = lasagne.layers.Conv2DLayer(l_conv4, num_filters=64, filter_size=3, pad=1)
l_pool4 = lasagne.layers.MaxPool2DLayer(l_conv4_2, 2)
l_conv5 = lasagne.layers.Conv2DLayer(l_pool4, num_filters=64, filter_size=3, pad=1)
l_conv5_2 = lasagne.layers.Conv2DLayer(l_conv5, num_filters=64, filter_size=3, pad=1)
l_pool5 = lasagne.layers.MaxPool2DLayer(l_conv5_2, 2)
l_out = lasagne.layers.DenseLayer(l_pool5, num_units=2, nonlinearity=lasagne.nonlinearities.softmax)

My last layer is a DenseLayer that uses a softmax to output my classification. My ultimate goal is to retrieve the probabilities rather than the classification (0 or 1).
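
For context, I believe the probabilities themselves can be read straight from the softmax output, roughly like this (a sketch; it assumes l_in is an InputLayer created with an input_var, which I have not shown above):

import theano
import lasagne

# symbolic class probabilities from the final softmax layer
probs = lasagne.layers.get_output(l_out, deterministic=True)

# compile a prediction function: input images -> class probabilities
predict_fn = theano.function([l_in.input_var], probs)
# predict_fn(batch) returns an array of shape (batch_size, 2), each row summing to 1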

When I call get_all_param_values(), it gives me back an extensive array covering the whole network, but I only want the weights and bias of the last dense layer. How do I go about this? I have tried l_out.W and l_out.b and get_values().
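
Concretely, those attempts looked roughly like this (sketch):

all_values = lasagne.layers.get_all_param_values(l_out)  # values for every layer, not just the last
W = l_out.W   # a Theano shared variable, not the numbers themselves
b = l_out.b
# W.get_values()  # no such method on a shared variable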

Thanks in advance!

Upvotes: 2

Views: 1572

Answers (2)

Indie AI

Reputation: 601

I modified your code slightly: what you pasted references an l_in that is never defined, so I replaced it with an InputLayer and defined the following network:

l_shape = lasagne.layers.InputLayer(shape = (None, 3, 130, 130))
l_conv1 = lasagne.layers.Conv2DLayer(l_shape, num_filters=32, filter_size=3, pad=1)
l_conv1_1 = lasagne.layers.Conv2DLayer(l_conv1, num_filters=32, filter_size=3, pad=1)
l_pool1 = lasagne.layers.MaxPool2DLayer(l_conv1_1, 2)
l_conv2 = lasagne.layers.Conv2DLayer(l_pool1, num_filters=64, filter_size=3, pad=1)
l_conv2_2 = lasagne.layers.Conv2DLayer(l_conv2, num_filters=64, filter_size=3, pad=1)
l_pool2 = lasagne.layers.MaxPool2DLayer(l_conv2_2, 2)
l_conv3 = lasagne.layers.Conv2DLayer(l_pool2, num_filters=64, filter_size=3, pad=1)
l_conv3_2 = lasagne.layers.Conv2DLayer(l_conv3, num_filters=64, filter_size=3, pad=1)
l_pool3 = lasagne.layers.MaxPool2DLayer(l_conv3_2, 2)
l_conv4 = lasagne.layers.Conv2DLayer(l_pool3, num_filters=64, filter_size=3, pad=1)
l_conv4_2 = lasagne.layers.Conv2DLayer(l_conv4, num_filters=64, filter_size=3, pad=1)
l_pool4 = lasagne.layers.MaxPool2DLayer(l_conv4_2, 2)
l_conv5 = lasagne.layers.Conv2DLayer(l_pool4, num_filters=64, filter_size=3, pad=1)
l_conv5_2 = lasagne.layers.Conv2DLayer(l_conv5, num_filters=64, filter_size=3, pad=1)
l_pool5 = lasagne.layers.MaxPool2DLayer(l_conv5_2, 2)
l_out = lasagne.layers.DenseLayer(l_pool5, num_units=2, nonlinearity=lasagne.nonlinearities.softmax)

Just to implement Daniel Renshaw's answer:

params = l_out.get_params()    # Theano shared variables belonging to l_out only
W = params[0].get_value()      # the dense layer's weights as a NumPy array

When you print params, you will see all the parameters for l_out:

[W, b] 

So each element of params (params[0] and params[1]) is a Theano shared variable, and you can get the numerical values with params[i].get_value().
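
Likewise for the bias (same pattern; the shape comment just reflects the two-unit output layer above):

b = params[1].get_value()  # bias of the dense layer, a NumPy array of shape (2,)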

Upvotes: 1

Daniel Renshaw

Reputation: 34187

You can get the parameters of a single layer using its get_params() method. This is explained in the documentation.

Upvotes: 1
