Reputation: 163
Regarding neural networks, why do we need weights and biases?
For weights, my intuition is that we're multiplying the inputs by certain constants so that we can reach the value of y and learn the relation, kind of like y = mx + c.
Please help me out with the intuition, especially for biases, if possible.
Upvotes: -2
Views: 1213
Reputation: 970
I'd like to credit this answer to Jed Fox, whose explanation from the site below I've adapted. It's a great intro to neural networks:
https://github.com/cazala/synaptic/wiki/Neural-Networks-101
Adapted answer:
Neurons in a network are based on neurons found in nature. They take information in and, according to that information, elicit a certain response: an "activation".
Artificial neurons look like this:
[Diagram: an artificial neuron with several weighted inputs, a bias input, and a single output]
As you can see, they have several inputs, and each input has a weight (the weight of that specific connection). When the artificial neuron activates, it computes its state by summing all the incoming inputs, each multiplied by its corresponding connection weight. But neurons always have one extra input, the bias, whose value is fixed at 1 and which has its own connection weight. This makes sure that even when all the inputs are zero (all 0s), there can still be an activation in the neuron.
After computing its state, the neuron passes it through its activation function, which squashes the result into a fixed range (commonly between 0 and 1, e.g. with a sigmoid).
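To make this concrete, here's a minimal sketch of a single neuron in Python (the input and weight values are made up purely for illustration):

```python
import numpy as np

def neuron(inputs, weights, bias_weight):
    # State: weighted sum of the inputs, plus the bias input (fixed at 1)
    # multiplied by its own connection weight
    state = np.dot(inputs, weights) + 1.0 * bias_weight
    # Sigmoid activation squashes the state into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-state))

x = np.array([0.0, 0.0, 0.0])         # all inputs are zero...
w = np.array([0.4, -0.2, 0.7])
print(neuron(x, w, bias_weight=0.5))  # ...yet the output is non-zero: ~0.62
```

Without the bias, the state would be 0 for every all-zero input, so the output could never be shifted away from sigmoid(0) = 0.5; the bias weight lets the neuron move that baseline.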
These weights (together with the bias weights) are what we learn in a neural network. Think of them as the parameters of the system; without them, the network would be pretty useless!
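Since the question mentions y = mx + c: learning a weight and a bias is essentially that picture. Here's a toy sketch (made-up data generated from y = 2x + 1, fitted with plain gradient descent on the squared error):

```python
import numpy as np

# Toy data generated from y = 2x + 1 -- the "m" and "c" we hope to recover
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs + 1.0

m, c = 0.0, 0.0   # start from arbitrary parameter values
lr = 0.05         # learning rate

for _ in range(2000):
    err = (m * xs + c) - ys
    # Gradient steps for the mean squared error w.r.t. m and c
    m -= lr * np.mean(err * xs)
    c -= lr * np.mean(err)

print(m, c)       # approaches 2.0 and 1.0
```

Here m plays the role of a weight and c the role of the bias's weight; a real network just learns many such parameters at once.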
Additional comment: in a network, those weighted inputs may come from other neurons, so you can begin to see that the weights also describe how neurons are related to each other, often signifying the importance of the relationship between two neurons.
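To see that, here's a minimal sketch of a tiny two-layer network (the weights are random purely for illustration; in a trained network they'd be learned values, and each entry encodes the strength of one neuron-to-neuron connection):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = np.array([0.5, -1.0, 0.25])   # 3 input features

# Layer 1: 4 neurons; W1[i, j] connects input i to hidden neuron j
W1 = rng.normal(size=(3, 4))
b1 = rng.normal(size=4)
hidden = sigmoid(x @ W1 + b1)     # activations of the hidden neurons

# Layer 2: 1 output neuron whose inputs are the hidden activations
W2 = rng.normal(size=(4, 1))
b2 = rng.normal(size=1)
output = sigmoid(hidden @ W2 + b2)
print(output)
```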
I hope this helps. There is plenty more information available on the internet and at the link above. Consider reading through some of Stanford's material on CNNs for information on more complicated neural networks.
Upvotes: 4