Zimu Wang

Reputation: 67

Using MLP for Feature Extraction and Dimension Reduction

I'm trying to build a model that uses an MLP for feature extraction and dimension reduction, transforming the data from 204 dimensions to 80 dimensions. The proposed model is as follows:

  1. A 512-dimension dense layer taking the original 204-dimension data as input
  2. A 256-dimension dense layer taking the 512-dimension output as input
  3. An 80-dimension dense layer taking the 256-dimension output as input

The proposed number of training epochs is 1, and the output of the MLP is used as the input to further models (e.g., LR, SVM, etc.).
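For reference, a minimal NumPy sketch of the shape flow through the proposed 204 → 512 → 256 → 80 stack (untrained, random weights; ReLU is an assumed activation, since the question doesn't specify one):

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    # One dense layer with ReLU activation (an assumption, not from the question)
    return np.maximum(x @ w + b, 0.0)

# Random (untrained) weights matching the proposed 204 -> 512 -> 256 -> 80 stack
w1, b1 = rng.normal(size=(204, 512)), np.zeros(512)
w2, b2 = rng.normal(size=(512, 256)), np.zeros(256)
w3, b3 = rng.normal(size=(256, 80)), np.zeros(80)

x = rng.normal(size=(10, 204))  # a batch of 10 samples, 204 features each
features = dense(dense(dense(x, w1, b1), w2, b2), w3, b3)
print(features.shape)           # (10, 80)
```

This only illustrates the dimensionality; the open question below is what target the weights would be trained against.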

My question is: when training the MLP, what loss function should I set? Is MSE loss OK, or should I use another loss function? Thanks!

Upvotes: 1

Views: 1139

Answers (1)

Galletti_Lance

Reputation: 559

What would you be training this MLP on? (what would be the target 80-dimensional "Y"?)

MLPs learn features at the same time as the model is trained. For example, if you wanted an MLP that does linear regression and learns an 80-dimensional set of features, you could create something like this:

from tensorflow import keras
from tensorflow.keras import layers

model = keras.models.Sequential()
# Hidden layer: extracts 80 features from the input
# (MY_ACTIVATION is a placeholder, e.g. "relu")
model.add(layers.Dense(80, input_dim=512, activation=MY_ACTIVATION))
model.add(layers.Dense(1))  # regression output
model.compile(loss="mean_squared_error")

In the last layer, the network will learn to find the "best" weights and biases to capture Y as a function of the 80 features extracted. These features are in turn a function of X - a function the network learns by adjusting for how well these features are able to capture Y (this is backpropagation).

So creating an MLP just to learn features doesn't make sense without a problem statement for what these features are supposed to do.

As such, I would recommend using something like Principal Component Analysis or Singular Value Decomposition instead. These methods project the data onto the k-dimensional subspace that captures the most variance (information) in the data.
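As a sketch of that route, here is PCA via SVD in pure NumPy with k=80 to match the question's target dimension (the data below is random and only illustrates the shapes):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 204))  # 500 samples, 204 features

# Center the data, then take the top-k right singular vectors as the projection
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 80
X_reduced = Xc @ Vt[:k].T        # project onto the k directions of largest variance
print(X_reduced.shape)           # (500, 80)

# Fraction of total variance retained by the first k components
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
```

Unlike the MLP, this needs no target Y, which is why it fits the "features only" use case described in the question.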

Upvotes: 0
