Ginopinoshow

Reputation: 91

How to impose weight symmetries in Lasagne

I'm working with Lasagne on a neural network.

I want the same weight to be applied to many neurons of the input layer (and of course I want each shared weight to be updated using the contributions of all the neurons it is tied to).

This is because my input has many symmetries: I have 24*n different inputs, but I want only 4*n distinct weights (n is a parameter I still need to choose).

How can I do it?

Upvotes: 0

Views: 79

Answers (1)

sygi

Reputation: 4647

Use theano.shared variables, e.g.:

import numpy as np
import theano
import lasagne

l_in = lasagne.layers.InputLayer((None, 10))
weights = theano.shared(np.zeros((10, 100), dtype=theano.config.floatX))
layer_1 = lasagne.layers.DenseLayer(l_in, num_units=100, W=weights)
layer_2 = lasagne.layers.DenseLayer(l_in, num_units=100, W=weights)
layer_3 = lasagne.layers.DenseLayer(l_in, num_units=100, b=layer_2.b)

This way layer_1 will share its weights with layer_2, and layer_2 will share its biases with layer_3. Because the layers hold the very same shared variable, the gradients from every layer that uses it are accumulated into a single update.

http://lasagne.readthedocs.io/en/latest/user/layers.html#parameter-sharing
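To see why sharing one variable gives the update behaviour you want for your 24*n-input / 4*n-weight symmetry, here is a plain NumPy sketch of the same idea (the replication factor 6 = 24*n / 4*n and all shapes are assumptions based on the numbers in your question, not anything Lasagne-specific): store only the small block of distinct weights, tile it to the full input size for the forward pass, and fold the gradient back by summing over the tied copies.

```python
import numpy as np

n = 2
num_units = 5
rng = np.random.default_rng(0)

# Only 4*n distinct weight rows are actually stored...
W_small = rng.standard_normal((4 * n, num_units))

# ...but the forward pass sees 24*n input features: each
# shared row is reused 6 times (24*n / 4*n = 6).
W_full = np.tile(W_small, (6, 1))      # shape (24*n, num_units)

x = rng.standard_normal(24 * n)
y = x @ W_full                          # forward pass

# Gradient of some loss w.r.t. W_full; for brevity assume dL/dy = 1,
# so dL/dW_full[i, j] = x[i].
grad_full = np.outer(x, np.ones(num_units))

# The update for a shared weight sums the contributions of all its
# tied copies -- exactly what a single theano.shared variable does.
grad_small = grad_full.reshape(6, 4 * n, num_units).sum(axis=0)

W_small -= 0.01 * grad_small            # one SGD step on the shared weights
```

This is just the mechanics: in Lasagne/Theano the tiling and the gradient summation happen automatically once every tied position reads from the same shared variable.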

Upvotes: 0
