Sajad Beheshti

Reputation: 689

Clustering using an MLP on an unlabeled dataset

How can I use a Multilayer Perceptron for clustering (like K-Means) on an unlabeled dataset? I have the MNIST dataset, which comes with labels, but I want to perform a clustering algorithm with an MLP instead. Any ideas?

Upvotes: 0

Views: 777

Answers (1)

bones.felipe

Reputation: 596

Edit: if the problem is restricted to using an MLP exclusively, I think you're looking for differentiable objectives for clustering (the K-Means objective is not differentiable because of the centroid-finding step). This is not a 'mainstream' approach to clustering, but there certainly seems to be some work on using deep networks to optimize differentiable clustering objectives:

  1. Differentiable Deep Clustering with Cluster Size Constraints: "we exploit the connection between optimal transport and k-means, and rely on entropic regularization to derive a fully-differentiable clustering loss that can be used in (P) and directly optimized with SGD". So you can apply SGD to an MLP; whether an MLP is the best architecture for this loss depends on your data. A simplified sketch of the general idea follows below.
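For illustration, here is a minimal PyTorch sketch of that general idea: an MLP encoder trained end-to-end together with learnable centroids under a soft (softmax-based) k-means style loss. This is a simplified relaxation, not the entropic optimal-transport loss from the paper, and the layer sizes, temperature, and random stand-in batch are all assumptions for the example:

```python
import torch
import torch.nn as nn

class DeepSoftKMeans(nn.Module):
    """MLP encoder plus learnable centroids, trained with a soft
    (differentiable) k-means style loss. A simplified relaxation,
    not the optimal-transport loss from the cited paper."""
    def __init__(self, in_dim=784, latent_dim=32, n_clusters=10, temperature=1.0):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        self.centroids = nn.Parameter(torch.randn(n_clusters, latent_dim))
        self.temperature = temperature

    def forward(self, x):
        z = self.encoder(x)                                     # (B, latent_dim)
        d2 = torch.cdist(z, self.centroids) ** 2                # squared distance to each centroid
        assign = torch.softmax(-d2 / self.temperature, dim=1)   # soft cluster assignments
        loss = (assign * d2).sum(dim=1).mean()                  # soft k-means objective
        return loss, assign

# usage sketch: one SGD step on a random batch standing in for flattened MNIST images
model = DeepSoftKMeans()
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x = torch.rand(64, 784)
loss, assign = model(x)
opt.zero_grad()
loss.backward()
opt.step()
print(loss.item(), assign.argmax(dim=1)[:10])
```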

Another approach I can think of using ANNs is self-organizing maps (or Kohonen maps). It depends on how relaxed your definition of an MLP is; you can certainly add a bunch of layers between the input layer and the output feature map.
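If you want to go the SOM route, a minimal from-scratch sketch in NumPy could look like the following; the grid size, decay schedules, and random stand-in data are placeholder assumptions:

```python
import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Very small self-organizing map. data is (n_samples, n_features);
    returns the trained weight grid of shape (grid[0], grid[1], n_features)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    n, d = data.shape
    weights = rng.random((h, w, d))
    # coordinates of each map unit, used by the neighborhood function
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    total_steps = epochs * n
    step = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            x = data[i]
            # linearly decay learning rate and neighborhood radius
            frac = step / total_steps
            lr = lr0 * (1 - frac)
            sigma = sigma0 * (1 - frac) + 1e-3
            # best matching unit (closest weight vector to the sample)
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighborhood around the BMU on the map grid
            grid_d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            influence = np.exp(-grid_d2 / (2 * sigma ** 2))[..., None]
            weights += lr * influence * (x - weights)
            step += 1
    return weights

# usage sketch with random vectors standing in for flattened MNIST digits
data = np.random.rand(500, 784)
som = train_som(data, grid=(8, 8), epochs=5)
```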

You can potentially use an MLP to embed your data into a vector space and then compute some metric (e.g. Euclidean distance) in that space during K-Means. Whether that makes sense depends on how you compute the embeddings and on the dataset.

You could do this with an autoencoder in the absence of labels, though that is a bit more complex than a simple MLP.

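As a rough illustration of that pipeline (train an MLP autoencoder on a reconstruction loss only, then run K-Means on the latent codes), a minimal PyTorch/scikit-learn sketch might look like this; the architecture sizes are arbitrary and the random tensor is just a stand-in for flattened MNIST images scaled to [0, 1]:

```python
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class AE(nn.Module):
    """Small MLP autoencoder: encode 784-dim images to a latent code,
    reconstruct them, then cluster the codes with plain K-Means."""
    def __init__(self, in_dim=784, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, in_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

# placeholder data standing in for flattened MNIST images in [0, 1]
x = torch.rand(2048, 784)

model = AE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(10):                      # train on reconstruction only, no labels
    recon, _ = model(x)
    loss = loss_fn(recon, x)
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    _, codes = model(x)                      # latent embeddings
labels = KMeans(n_clusters=10, n_init=10).fit_predict(codes.numpy())
print(labels[:20])
```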

This could be overkill, though; it really depends on the problem. Consider doing K-Means on your data first (no MLP); a minimal example is sketched below. If the problem is complicated enough, moving the data to a latent space could work. This is essentially what word2vec does, and people do clustering and all sorts of things with those embeddings (see this).
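The no-MLP baseline mentioned above can be as short as the following, assuming you are happy to pull MNIST from OpenML via scikit-learn:

```python
from sklearn.datasets import fetch_openml
from sklearn.cluster import KMeans

# baseline: K-Means directly on the raw pixels, no MLP involved
X, _ = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = X / 255.0
clusters = KMeans(n_clusters=10, n_init=10).fit_predict(X)
print(clusters[:20])
```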

Upvotes: 1
