Edison

Reputation: 4281

sknn multi layer perceptron classifier

I am using the following neural net classifier in python

from sknn.mlp import Layer, Classifier

nn = Classifier(
            layers=[
                Layer("Tanh", units=n_feat // 8),    # units must be an integer
                Layer("Sigmoid", units=n_feat // 16),
                Layer("Softmax", units=n_targets)],
            n_iter=50,
            n_stable=10,
            batch_size=25,
            learning_rate=0.002,
            learning_rule="momentum",
            valid_size=0.1,
            verbose=1)

which is working just fine. My question is: how should I proceed if I need, for example, 100, 200, or 500 hidden layers? Do I have to specify each layer manually here, or does someone have a better idea for building an MLP in Python?

Upvotes: 1

Views: 1737

Answers (1)

Aenimated1

Reputation: 1624

You could create some loop-based mechanism to build the list of layers, I suppose, but there's a bigger issue here. A standard MLP with hundreds of layers is likely to be extremely expensive to train, both in computational cost and memory usage. MLPs typically have only one or two hidden layers, or occasionally a few more. For problems that can truly benefit from more hidden layers, it becomes important to incorporate some of the lessons learned in the field of deep learning. For example, for object classification on images, using all fully-connected layers is incredibly inefficient, because you're interested in identifying spatially-local patterns, and interactions between spatially-distant pixels or regions are largely noise. (This is a perfect case for a deep convolutional neural net.)
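That said, if you do want to generate the layer list in a loop, a minimal sketch might look like the following. The `build_layers` name, the constant hidden-layer width, and the tuple fallback are my assumptions, not part of sknn; with sknn installed you would pass `make_layer=lambda kind, units: Layer(kind, units=units)` so that real `sknn.mlp.Layer` objects are produced.

```python
def build_layers(n_feat, n_targets, n_hidden=100, make_layer=None):
    """Build a layer list in a loop: n_hidden Sigmoid hidden layers
    followed by a Softmax output layer.

    make_layer defaults to a plain (type, units) tuple so this sketch
    runs without sknn installed; substitute a factory that constructs
    sknn.mlp.Layer objects for real use.
    """
    if make_layer is None:
        make_layer = lambda kind, units: (kind, units)
    # Fixed hidden width here for simplicity; any schedule works.
    hidden_units = max(n_feat // 2, 2)
    layers = [make_layer("Sigmoid", hidden_units) for _ in range(n_hidden)]
    layers.append(make_layer("Softmax", n_targets))
    return layers
```

The resulting list can then be passed straight to `Classifier(layers=...)`, keeping the rest of the constructor arguments from the question unchanged.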

Although some very deep networks have been created, it's worth pointing out that even Google's very powerful Inception-v3 model is only 42 layers deep. Anyway, if you're interested in building deep models, I'd recommend reading this Deep Learning book. From what I've read of it, it seems to be a very good introduction. Hope that helps!

Upvotes: 1
