Reputation: 711
Is it possible to fit or approximate multidimensional functions with neural networks?
Let's say I want to model the function f(x,y) = sin(x) + y from some given measurement data (f(x,y) is considered the ground truth and is not known). Also, if possible, some code examples written in TensorFlow or Keras would be great.
Upvotes: 0
Views: 2092
Reputation: 6220
As @AndreHolzner said, in theory you can approximate any continuous function with a neural network as well as you want, on any compact subset of R^n, even with only one hidden layer.
However, in practice, the neural net may have to be very large for some functions, and can sometimes be untrainable (the optimal weights may be hard to find without getting stuck in a local minimum). So here are a few practical suggestions (unfortunately vague, because the details depend too much on your data and are hard to predict without multiple tries):
- If you have a strong prior about the form of f (like in your example, but it could be more complicated), you could apply the sin() function to some of the outputs of the first layer (not all of them, or the output would be truly periodic). If you suspect a polynomial of degree n, just augment your input x with x², ..., x^n and use a linear regression on that input, etc. It will be much easier than learning the weights.
- The approximation only holds on a compact subset of R^n, not on the entire multidimensional space. In particular, you'll never be able to predict the value for an input that's way bigger than any of the training samples (say you trained on numbers from 0 to 100: don't test on 200, it will fail).

For an example of regression you can look here for instance. To regress a more complicated function you'd need to stack more complicated transformations to get pred from x, for instance like this:
n_layers = 3
n_dimensions = 2  # e.g. an (x, y) input
# Use None for the batch dimension, which is unknown at graph-construction time
x = tf.placeholder(shape=[None, n_dimensions], dtype=tf.float32)
last_layer = x
# Add n_layers dense hidden layers
for i in range(n_layers):
    last_layer = tf.layers.dense(inputs=last_layer, units=128, activation=tf.nn.relu)
# Get the output prediction
pred = tf.layers.dense(inputs=last_layer, units=1, activation=None)
# Get the cost, training op, etc, just like in the linear regression example
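Since the question also asked for a Keras example, here is a minimal end-to-end sketch using the tf.keras API (assuming TensorFlow 2.x is available; the architecture mirrors the one above, and the sample sizes and epoch count are arbitrary choices) that fits f(x,y) = sin(x) + y from sampled data:

```python
import numpy as np
import tensorflow as tf

# Sample noisy-free measurements of the (in practice unknown) ground truth f(x, y) = sin(x) + y
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(3000, 2)).astype("float32")
y = np.sin(X[:, 0]) + X[:, 1]

# Same shape of network as above: 3 dense ReLU layers, one linear output unit
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=64, verbose=0)

# Only evaluate inside the training range (see the compact-subset caveat above)
X_test = np.array([[1.0, 2.0], [-2.0, 0.5]], dtype="float32")
preds = model.predict(X_test, verbose=0)
print(preds)  # should be close to sin(x) + y for each row
```

Note that the model is only queried inside the region covered by the training samples; as discussed above, extrapolating far outside that region will fail.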
Upvotes: 1