Reputation: 5086
I am studying perceptron learning, and have a question which leaves me a bit confused. As I am self-teaching, I have looked through a variety of papers, tutorials, PowerPoints, etc., and at times they seem to use different algorithms to adjust the weights of the network.
For example, some include a learning rate, others update using the individual weight/input products, while others use just the sum of all weight/input products.
So, am I right in assuming that there are multiple algorithms which all lead to the same final weight matrix/vector?
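For concreteness, here is a sketch of the most common variant I have seen (the toy AND dataset and function names are my own, just for illustration):

```python
import numpy as np

def perceptron_update(w, x, y, lr=1.0):
    # Predict with a hard threshold on the weighted sum of the inputs.
    y_hat = 1 if np.dot(w, x) >= 0 else 0
    # Classic perceptron rule: adjust weights only on a mistake,
    # scaled by the learning rate lr. Setting lr=1 recovers the
    # "no learning rate" variant that some texts present.
    return w + lr * (y - y_hat) * x

# Hypothetical example: learn the AND function.
# Each row has a constant bias input of 1 prepended.
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
Y = np.array([0, 0, 0, 1])

w = np.zeros(3)
for _ in range(10):  # a few epochs over the (linearly separable) data
    for x, y in zip(X, Y):
        w = perceptron_update(w, x, y, lr=0.5)
```

After training, the learned weights classify all four AND cases correctly.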
Upvotes: 1
Views: 1793
Reputation: 12079
I wrote an article, An Intuitive Example of Artificial Neural Network (Perceptron) Detecting Vehicles and Pedestrians from the Camera of a Self-driven Car, where I tried to explain with the simplest possible examples.
I hope it can help you understand weight updating in the perceptron. Here is the link:
https://www.spicelogic.com/Blog/Perceptron-Artificial-Neural-Networks-10
I also explained the learning rate with examples.
Upvotes: 1
Reputation: 3194
Nope, not the same.
You are right that there are many algorithms, but they may lead to different weights. It's like sorting algorithms: there are many, each of them does the same thing, but some are stable and some are not, some use additional memory, and some sort in place.
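You can see this with the standard mistake-driven perceptron rule itself: two different learning rates produce two different final weight vectors, even though (starting from zero weights) they describe the same decision boundary. A minimal sketch, assuming a toy AND dataset of my own choosing:

```python
import numpy as np

def train_perceptron(lr, epochs=10):
    # Hypothetical toy data: the AND function with a bias input prepended.
    X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
    Y = np.array([0, 0, 0, 1])
    w = np.zeros(3)
    for _ in range(epochs):
        for x, y in zip(X, Y):
            y_hat = 1 if np.dot(w, x) >= 0 else 0
            # Mistake-driven update, scaled by the learning rate.
            w = w + lr * (y - y_hat) * x
    return w

w_a = train_perceptron(lr=0.5)
w_b = train_perceptron(lr=1.0)
# w_a and w_b are different weight vectors. With zero initialization the
# prediction depends only on the sign of the weighted sum, so scaling lr
# scales the whole trajectory: the vectors are proportional, and the
# decision boundary is the same.
```

So the learned weights depend on the algorithm's details (and on the learning rate), even when the resulting classifiers agree.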
Upvotes: 1