Filippo Galli

Reputation: 61

Theano MLP with 2 hidden layers throws Shape Mismatch error

I'm getting started with neural network implementations and am trying to build a working MLP using Theano. Following the tutorial, I tried to extend the net by adding a layer, for a total of two hidden layers, each with the same number of units (250). The problem is that when I run the script I get a "Shape mismatch" ValueError. My code is a modified version of the tutorial code found here: http://deeplearning.net/tutorial/mlp.html.

The part I modified is snippet 2, namely the MLP class, as follows:

class MLP(object):

    def __init__(self, rng, input, n_in, n_hidden, n_out):
        """Initialize the parameters for the multilayer perceptron

        :type rng: numpy.random.RandomState
        :param rng: a random number generator used to initialize weights

        :type input: theano.tensor.TensorType
        :param input: symbolic variable that describes the input of the
        architecture (one minibatch)

        :type n_in: int
        :param n_in: number of input units, the dimension of the space in
        which the datapoints lie

        :type n_hidden: int
        :param n_hidden: number of hidden units

        :type n_out: int
        :param n_out: number of output units, the dimension of the space in
        which the labels lie

        """

        self.hiddenLayer1 = HiddenLayer(
            rng=rng,
            input=input,
            n_in=n_in,
            n_out=n_hidden,
            activation=T.tanh
        )
        # try second hidden layer
        self.hiddenLayer2 = HiddenLayer(
            rng=rng,
            input=self.hiddenLayer1.output,
            n_in=n_in,
            n_out=n_hidden,
            activation=T.tanh
        )

        # The logistic regression layer gets as input the hidden units
        # of the hidden layer
        self.logRegressionLayer = LogisticRegression(
            input=self.hiddenLayer2.output,
            n_in=n_hidden,
            n_out=n_out
        )
        # end-snippet-2 start-snippet-3
        # L1 norm ; one regularization option is to enforce L1 norm to
        # be small
        self.L1 = (
            abs(self.hiddenLayer1.W).sum()
            + abs(self.hiddenLayer2.W).sum()
            + abs(self.logRegressionLayer.W).sum()
        )

        # square of L2 norm ; one regularization option is to enforce
        # square of L2 norm to be small
        self.L2_sqr = (
            (self.hiddenLayer1.W ** 2).sum()
            + (self.hiddenLayer2.W ** 2).sum()
            + (self.logRegressionLayer.W ** 2).sum()
        )

        # negative log likelihood of the MLP is given by the negative
        # log likelihood of the output of the model, computed in the
        # logistic regression layer
        self.negative_log_likelihood = (
            self.logRegressionLayer.negative_log_likelihood
        )
        # same holds for the function computing the number of errors
        self.errors = self.logRegressionLayer.errors

        # the parameters of the model are the parameters of the layers it is
        # made out of
        self.params = self.hiddenLayer1.params + self.hiddenLayer2.params + self.logRegressionLayer.params
        # end-snippet-3

        # keep track of model input
        self.input = input

I also removed some comments for readability. The error I get is:

ValueError: Shape mismatch: x has 250 cols (and 20 rows) but y has 784 rows (and 250 cols)
Apply node that caused the error: Dot22(Elemwise{Composite{tanh((i0 + i1))}}[(0, 0)].0, W)
Inputs types: [TensorType(float64, matrix), TensorType(float64, matrix)]
Inputs shapes: [(20, 250), (784, 250)]
Inputs strides: [(2000, 8), (2000, 8)]
Inputs values: ['not shown', 'not shown']

Upvotes: 1

Views: 597

Answers (1)

Daniel Renshaw

Reputation: 34177

The size of the input to layer 2 needs to match the size of the output from layer 1.

hiddenLayer2 takes hiddenLayer1's output as its input, and hiddenLayer1.n_out == n_hidden while hiddenLayer2.n_in == n_in. In this case n_hidden == 250 and n_in == 784. They should match but don't, hence the error.
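
The error message shows this concretely: the product inside hiddenLayer2 is between the (20, 250) output of hiddenLayer1 and a (784, 250) weight matrix. A minimal NumPy sketch of the same failing product (shapes taken from the error message; the variable names are just illustrative):

import numpy as np

# output of hiddenLayer1 for a minibatch of 20 examples -> shape (20, 250)
h1_output = np.zeros((20, 250))
# weight matrix of hiddenLayer2, created with n_in == 784 -> shape (784, 250)
W2 = np.zeros((784, 250))

# the inner dimensions (250 and 784) must agree, so this raises the same
# kind of shape error that Theano's Dot22 reports
np.dot(h1_output, W2)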

The solution is to make hiddenLayer2.n_in == hiddenLayer1.n_out.
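
In the question's code that means passing n_hidden rather than n_in when building the second hidden layer, roughly like this (a sketch against the tutorial's HiddenLayer class):

# second hidden layer: its input size must equal the first layer's output size
self.hiddenLayer2 = HiddenLayer(
    rng=rng,
    input=self.hiddenLayer1.output,
    n_in=n_hidden,   # was n_in (784); must be 250 to match hiddenLayer1.n_out
    n_out=n_hidden,
    activation=T.tanh
)

More generally, whenever layers are stacked, each layer's n_in has to equal the previous layer's n_out.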

Upvotes: 2
