Aditya

Reputation: 2520

Neural Networks (backpropagation)

Suppose we have trained a neural network. My question is: will that same neural network generate the original data back if we feed what was previously the output in as the input?

I was working on the MNIST dataset and wondered what would happen if we ran our network from the output side (using the final output as the input from that side itself) using the backpropagation algorithm.

My intuition says that it could get the data back (or at least approximations of the original dataset). Can this be justified?
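To make the question concrete, here is a toy NumPy sketch (an assumed setup, not MNIST-scale) of literally running a single trained layer "backwards". For one layer y = sigmoid(Wx + b) with a square, invertible W this works exactly; as soon as the layer reduces dimension, as real classifiers like an MNIST network do, information is lost and only an approximation is possible:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(y):
    # Inverse of the sigmoid activation
    return np.log(y / (1.0 - y))

rng = np.random.default_rng(1)

# Case 1: square weight matrix -> the layer is (almost surely) invertible
W = rng.normal(size=(4, 4))
b = rng.normal(size=4)
x = rng.normal(size=4)

y = sigmoid(W @ x + b)                      # forward pass
x_rec = np.linalg.solve(W, logit(y) - b)    # exact "backwards" pass
print(np.allclose(x, x_rec))                # True: input recovered exactly

# Case 2: 4 inputs -> 2 outputs, like a classifier compressing its input
W2 = rng.normal(size=(2, 4))
y2 = sigmoid(W2 @ x + b[:2])
# Best we can do is a least-squares guess; many inputs give the same output
x_ls = np.linalg.lstsq(W2, logit(y2) - b[:2], rcond=None)[0]
print(np.allclose(x, x_ls))                 # False: only an approximation
```

So the answer depends heavily on whether the network throws information away between layers.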

Upvotes: 1

Views: 168

Answers (1)

Thomas Wagenaar

Reputation: 6779

As far as I know, it can't, especially because activation functions are (mostly) non-linear.

First of all, a neural network is a black box (see this answer). Second, take f(x) = x^2: if you want to recover n from f(n), there are two possible solutions. The same holds for neural networks: a given output can have multiple possible inputs, so you can't invert the mapping to a unique answer. But the main point is this: just because you know the inverse of each activation function doesn't mean you know the inverse of the whole network. It's a black box!
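The many-inputs-one-output point can be shown in a couple of lines. This is a toy example of my own (a single ReLU unit, not from any real trained network): distinct inputs collapse to the same output, so no inverse function can exist:

```python
import numpy as np

# One ReLU unit with 2 inputs and 1 output (weights chosen by hand)
W = np.array([[1.0, -1.0]])
b = np.array([0.0])

def layer(x):
    return np.maximum(0.0, W @ x + b)   # ReLU activation

x1 = np.array([2.0, 1.0])    # W @ x1 = 1
x2 = np.array([3.0, 2.0])    # W @ x2 = 1 as well -> identical output
print(layer(x1), layer(x2))  # same output from two different inputs

x3 = np.array([-1.0, 5.0])   # pre-activation -6, clipped to 0 by ReLU
x4 = np.array([-2.0, 9.0])   # pre-activation -11, also clipped to 0
print(layer(x3), layer(x4))  # every negative pre-activation maps to 0
```

The ReLU case is even worse than x^2: an entire half-space of inputs maps to the same output, so the "inverse" of 0 is infinitely many points.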

However, you can visualise what input a neuron responds to. For example, these are the 'aspects' a neural network looks for to recognize a face:

[image: visualizations of the features the network's neurons respond to when recognizing faces]

Google DeepDream similarly amplifies the features a network is looking for when it recognizes certain objects. Check it out!
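The idea behind those visualizations (and DeepDream) is gradient ascent on the input rather than inverting the network: you search for an input that maximally activates a chosen unit. A minimal sketch, with a random linear stand-in for a trained model (all weights and sizes here are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 64))      # pretend model: 64-pixel "image" -> 10 units

def score(x, unit=3):
    # Activation of one output unit (linear toy model)
    return W[unit] @ x

x = np.zeros(64)                   # start from a blank input
for _ in range(100):
    grad = W[3]                    # d(score)/dx for this linear toy model
    x += 0.1 * grad                # gradient ASCENT step on the input
    x = np.clip(x, -1.0, 1.0)      # keep the "image" in a valid pixel range

# x now approximates the pattern unit 3 responds to most strongly,
# without ever inverting the network
print(score(x) > score(np.zeros(64)))
```

In a real framework you would backpropagate the unit's activation to the input pixels instead of using a closed-form gradient, but the loop is the same.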

Upvotes: 1
