Reputation: 21
I have designed a neural network with two hidden layers that use different activation functions. How can I set a different activation function for each hidden layer using sklearn.neural_network.MLPClassifier?
Is there something similar to this?
from sklearn.neural_network import MLPClassifier
clf = MLPClassifier(alpha=1e-5, hidden_layer_sizes=(10, 5), activation=['tanh', 'relu'])
The error was: raise ValueError("The activation '%s' is not supported")
Upvotes: 2
Views: 2321
Reputation: 4453
From the documentation, the activation can be one of:
activation : {'identity', 'logistic', 'tanh', 'relu'}, default='relu'
Activation function for the hidden layer.
'identity', no-op activation, useful to implement linear bottleneck, returns f(x) = x.
'logistic', the logistic sigmoid function, returns f(x) = 1 / (1 + exp(-x)).
'tanh', the hyperbolic tan function, returns f(x) = tanh(x).
'relu', the rectified linear unit function, returns f(x) = max(0, x).
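So a valid call passes a single string, which is then applied to every hidden layer. For example, using the same layer sizes and alpha as in the question:

from sklearn.neural_network import MLPClassifier

# One activation string applies to all hidden layers.
clf = MLPClassifier(alpha=1e-5, hidden_layer_sizes=(10, 5), activation='tanh')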
There is no option to set different activations for different layers. Recall that an MLP is conceptually simpler than a full-fledged neural network. If you want a simple architecture, why not just use the same activation for both layers? If you want per-layer control, switch to a real deep learning framework, as in the sketch below.
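As a minimal sketch (assuming PyTorch here, but any framework with per-layer control would do), the two-hidden-layer architecture from the question with tanh on the first layer and ReLU on the second could look like this; n_features and n_classes are placeholders for your data's dimensions:

import torch.nn as nn

# Hypothetical equivalent of hidden_layer_sizes=(10, 5) with mixed activations.
n_features, n_classes = 20, 3
model = nn.Sequential(
    nn.Linear(n_features, 10),
    nn.Tanh(),   # activation for the first hidden layer
    nn.Linear(10, 5),
    nn.ReLU(),   # activation for the second hidden layer
    nn.Linear(5, n_classes),
)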
Upvotes: 2