Tim.G.

Reputation: 299

Python MLPClassifier Value Error

I'm currently trying to train the MLPClassifier implemented in sklearn. When I try to train it with the values given below, I get this error:

ValueError: setting an array element with a sequence.

The format of the feature_vector is

[ [one_hot_encoded brandname], [different apps scaled to mean 0 and variance 1] ]
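Roughly, this is how I put the vector together (a simplified sketch with made-up sizes; the real arrays are much longer):

import numpy as np

# simplified sketch with made-up sizes (the real arrays are far longer)
brand_one_hot = np.array([0., 0., 1., 0., 0.])        # one_hot_encoded brandname
apps_scaled   = np.array([0.82, -0.23, 4.46, -0.23])  # apps scaled to mean 0 and variance 1

feature_vectors = [brand_one_hot, apps_scaled]        # two arrays of different length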

Does anybody know what I'm doing wrong?

Thank you!




feature_vectors:

[

array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]),

array([ 0.82211852, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 4.45590895, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 0.3439882 , -0.22976818, -0.22976818, -0.22976818, 4.93403927, -0.22976818, -0.22976818, -0.22976818, 0.63086639, 1.10899671, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 1.58712703, -0.22976818, 1.77837916, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 2.16088342, -0.22976818, 2.16088342, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 9.42846428, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 0.91774459, -0.22976818, -0.22976818, 4.16903076, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 2.44776161, -0.22976818, -0.22976818, -0.22976818, 1.96963129, 1.96963129, 1.96963129, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 7.13343874, 5.98592598, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 3.02151799, 4.26465682, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 2.25650948, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 1.30024884, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 4.74278714, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 
-0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 0.3439882 , -0.22976818, 0.3439882 , -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 0.53524033, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818, 3.49964831, -0.22976818, -0.22976818, -0.22976818, -0.22976818, -0.22976818])

]

g_a_group:

[ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]




MLP:

from sklearn.neural_network import MLPClassifier

clf = MLPClassifier(solver='lbfgs', alpha=1e-5, hidden_layer_sizes=(5, 2), random_state=1)

clf.fit(feature_vectors, g_a_group)

Upvotes: 0

Views: 563

Answers (1)

lejlot

Reputation: 66775

Your data does not make sense from scikit-learn's perspective of what .fit expects. The feature vectors are supposed to form a matrix of size N x d, where N is the number of data points and d is the number of features, and your second argument should hold the labels, so it should be a vector of length N (or an N x k matrix, where k is the number of outputs/labels per point). Whatever your variables represent, their sizes do not match what they are supposed to represent.
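A minimal sketch of getting the shapes right, assuming each data point is one (brand one-hot, scaled apps) pair plus one one-hot group vector like g_a_group above; the names `pairs` and `group_one_hots` are placeholders for however you actually store them:

import numpy as np
from sklearn.neural_network import MLPClassifier

# `pairs`: hypothetical list of (brand_one_hot, apps_scaled) tuples, one per data point
# `group_one_hots`: hypothetical list of one-hot group vectors, one per data point
X = np.vstack([np.concatenate([brand, apps]) for brand, apps in pairs])  # shape (N, d)
y = np.array([np.argmax(group) for group in group_one_hots])             # shape (N,)

clf = MLPClassifier(solver='lbfgs', alpha=1e-5, hidden_layer_sizes=(5, 2), random_state=1)
clf.fit(X, y)

The key point is that every row of X is one flat feature vector of the same length, and y holds exactly one label entry per row of X.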

Upvotes: 1
