kenneth

Reputation: 143

Why is the neural network not learning the curve?

For experimental purposes, I use tf.keras to build a neural network with a single neuron followed by a sigmoid activation. The target curve to learn is:

import numpy as np

# target function
f = lambda x: -1./(np.exp(10.*x) + 1.)

I sampled a few points from the curve for training data.

# create training data

x_train = np.linspace(-1, 1, 111)
y_train = f(x_train)


# test data

x_test = np.linspace(-1, 1, 11)
y_test = f(x_test)

The model is defined as follows:

model = tf.keras.models.Sequential([
  tf.keras.layers.Dense(1, activation='sigmoid', input_shape=(1,), use_bias=True)
])

model.compile(optimizer=tf.keras.optimizers.Adam(0.01),
              loss='mse',
              metrics=['MeanAbsoluteError'])
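
The training step itself is not shown here; it is in the Colab linked below. It is an ordinary fit call roughly like this (the epoch count is only a placeholder):

# train on the sampled points (epoch count is a placeholder; the real value is in the Colab)
model.fit(x_train, y_train, epochs=500, verbose=0)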

But the model does not learn the curve. The test code is:

x_test = np.linspace(-1, 1, 11)
plt.plot(x_test, f(x_test), label='true')
y_pred = model.predict(x_test)
plt.plot(x_test, y_pred, label='predict')
plt.legend()
plt.show()

[Plot: the true curve vs. the model's prediction]

The full code is shared on Colab:

https://colab.research.google.com/drive/1LQ9MXjrMxsImc80o6wMk1oKfeadnNaG3

There must be an obvious mistake somewhere. Can anybody help?

Upvotes: 1

Views: 93

Answers (1)

nneonneo

Reputation: 179422

The sigmoid activation function can only output values between 0 and 1. Since all the values of f(x) are negative, the function cannot be learned.

One way to handle this is to normalize the target values into [0, 1]. In your case, simply learning the negated function f = lambda x: 1./(np.exp(10.*x)+1.) works fine; you can then flip the sign of the predictions to recover the original curve.
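
A minimal sketch of that idea, assuming the same single-neuron model as in the question (the helper name g and the epoch count are just illustrative choices): fit the positive function g(x) = 1/(exp(10x)+1), then negate the prediction to recover f(x) = -g(x).

import numpy as np
import tensorflow as tf

# positive version of the target; its range fits inside the sigmoid's (0, 1)
g = lambda x: 1./(np.exp(10.*x) + 1.)

x_train = np.linspace(-1, 1, 111).reshape(-1, 1)
y_train = g(x_train)

model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(1, activation='sigmoid', input_shape=(1,))
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.01), loss='mse')
model.fit(x_train, y_train, epochs=500, verbose=0)  # epoch count chosen arbitrarily

# negate the prediction to recover the original f(x) = -g(x)
x_test = np.linspace(-1, 1, 11).reshape(-1, 1)
y_pred = -model.predict(x_test).ravel()

Note that a single sigmoid neuron can represent g exactly, since g(x) = sigmoid(-10x): the neuron only has to learn weight -10 and bias 0.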

Upvotes: 2
