Reputation: 1
Is supervised training of a neural network with two unknown outputs possible when there is a relation such as y = a·x^b between the known parameters (x, y) and the unknowns (a, b)? Here (a, b) are the outputs of the network.
Upvotes: 0
Views: 136
Reputation: 66815
The direct consequence of the universal approximation theorem is that any continuous function from a compact subset of R^d onto the k-dimensional hypercube can be approximated by a standard feed-forward neural network to within a given error bound eps.
So in simple words: in fact, every such function can be approximated by a neural network. This does not mean that in practice any particular algorithm will actually find it (the proof is purely existential and gives no intuition about "where to look").
So if your question is "is it possible to train a network that will approximate my function?", the answer is yes. If the question is "is it possible to make a neural network represent my function exactly?", the answer is also yes, but only with a custom activation function.
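As a concrete illustration of the "yes, it can be approximated" part, here is a minimal sketch (not the answer's own code) of a one-hidden-layer tanh network trained with plain full-batch gradient descent in NumPy to fit a continuous function on a compact interval; the target `sin`, the layer width, and all hyperparameters are arbitrary choices for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: a continuous target on a compact interval (here sin on [-pi, pi]).
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

# One hidden tanh layer with a linear output -- the standard feed-forward
# form covered by the universal approximation theorem.
H = 32
W1 = rng.normal(0.0, 1.0, (1, H))
b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, (H, 1))
b2 = np.zeros(1)

lr = 0.1
for _ in range(8000):
    h = np.tanh(x @ W1 + b1)      # hidden activations
    pred = h @ W2 + b2            # network output
    err = pred - y
    # Backpropagate the mean-squared-error gradient by hand.
    gW2 = h.T @ err / len(x)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    gW1 = x.T @ dh / len(x)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(mse)  # small residual error, as the theorem guarantees is achievable
```

The theorem only guarantees that some set of weights achieves the error bound; whether gradient descent finds it depends on initialization, width, and learning rate, which is exactly the "existential, not constructive" caveat above.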
Upvotes: 1