Reputation: 37
I want a perceptron to tell me whether the stock price tomorrow will be up or down, but the weights get really high or low, around 1000 or -1000. Is that normal, or do I need to modify the input data so it falls within a fixed range?
The code: https://colab.research.google.com/drive/1XeP5UjtZVq2eKozmoiFfGy_8Zyg8X5Y7?usp=sharing
Upvotes: 0
Views: 37
Reputation: 71
There is an issue with the gradient descent in your code. From your method `makeGuess` I conclude that your network computes

    A(y) = A(w1*x1 + w2*x2 + w3*x3 + ...)

with A being the activation function and A(y) being the output of your network for the given input vector (x1, x2, x3, ...). As activation function you use a step function while trying to use gradient descent as the optimization method. The problem is that gradient descent makes no sense with a step function: its derivative is zero everywhere except at the jump, so there is no gradient signal to follow. For more background on this please read: https://stats.stackexchange.com/questions/271701/why-is-step-function-not-used-in-activation-functions-in-machine-learning
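You can see this numerically: away from the jump, the step function's derivative is exactly zero, so the weight update never gets a usable gradient. (A minimal sketch; `step` here is a stand-in for the step activation in your notebook, whose exact thresholds may differ.)

```python
import numpy as np

# Step activation as a stand-in for the one in the question's network.
def step(y):
    return np.where(y >= 0, 1.0, -1.0)

# Central-difference derivative of the step function at a few points:
h = 1e-6
for y in [-2.0, -0.5, 0.5, 2.0]:
    dstep = (step(y + h) - step(y - h)) / (2 * h)
    print(f"step'({y}) = {dstep}")  # 0.0 at every point away from the jump
```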
In your method `train` you also use gradient descent incorrectly: you ignore the activation function in the forward pass. Think about using tanh as the non-linear activation instead. Then you get output values in the interval (-1, 1), and because tanh is differentiable you can apply gradient descent correctly, propagating the error through the activation via the chain rule.
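A minimal sketch of what such a corrected training loop could look like, assuming a single tanh unit trained with gradient descent on mean squared error (the data, learning rate, and target construction below are made up for illustration; they are not from your notebook):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 3 input features, targets in (-1, 1); a hypothetical stand-in
# for the up/down labels in the question.
X = rng.uniform(-1, 1, size=(200, 3))
true_w = np.array([0.5, -1.0, 2.0])        # hidden "true" weights
t = np.tanh(X @ true_w)

w = rng.uniform(-1, 1, size=3)             # initial weights
lr = 0.5

for _ in range(2000):
    y = X @ w                              # linear combination
    out = np.tanh(y)                       # activation applied in the forward pass
    err = out - t
    # Chain rule for MSE: dL/dw = mean(err * tanh'(y) * x), with tanh'(y) = 1 - tanh(y)^2
    grad = (err * (1.0 - out**2)) @ X / len(X)
    w -= lr * grad

print(w)  # stays in a sensible range instead of drifting toward +-1000
```

Note how the gradient includes the factor `1 - out**2` (the derivative of tanh), which is exactly the term that is missing when the activation is ignored in the forward path.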
Moreover, you should think about a better weight initialization. Keras, for example, uses `glorot_uniform` initialization by default, which is a great option to start with. Right now you are using uniform(-1, 1), which is not the best idea; it may be worth looking into this topic, too.
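For reference, Glorot/Xavier uniform initialization draws weights from a uniform distribution whose limit depends on the layer's fan-in and fan-out. A minimal hand-rolled version (the helper name `glorot_uniform` below is my own, not a Keras import) could look like:

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, rng):
    # Glorot/Xavier uniform: limit = sqrt(6 / (fan_in + fan_out)), chosen so
    # the variance of activations stays roughly constant across layers.
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
w = glorot_uniform(3, 1, rng)  # 3 inputs, 1 output unit
print(w.ravel())               # all values lie within +-sqrt(6/4), about +-1.22
```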
If you do all this, you can then think about scaling your input data. You could use `MinMaxScaler` from the sklearn package and try different ranges (e.g. -1 to 1) to see where you get the best results.
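A short sketch of that scaling step, with made-up price columns standing in for your real features:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Hypothetical raw price features (replace with the real input columns).
X = np.array([[980.0, 1010.0],
              [1000.0, 990.0],
              [1020.0, 1005.0]])

# Scale every column into the range [-1, 1]; other ranges are worth trying too.
scaler = MinMaxScaler(feature_range=(-1, 1))
X_scaled = scaler.fit_transform(X)
print(X_scaled)  # each column now spans exactly [-1, 1]
```

Remember to fit the scaler on the training data only and reuse it (via `scaler.transform`) on any later data, so the scaling at prediction time matches the scaling used during training.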
Upvotes: 1