chris4953

Reputation: 1

how are important weights defined in a neural network?

So I have the weights of a pretrained neural network, but I'm a bit lost as to what each of the numbers means. Across all the neurons at every layer of a network, what do negative weights and positive weights mean? Does a weight that's far from 0 mean it's very important?

Upvotes: 0

Views: 480

Answers (1)

K0mp0t

Reputation: 99

First of all, are you sure that you need to understand those numbers? Large CNNs and RNNs may have millions of parameters.

The answer:

  1. The sign of a weight means almost nothing on its own; it's just a coefficient in an equation.
  2. The absolute value of a weight (its distance from zero), though, means a lot. Large-magnitude weights produce strong outputs, which can be a sign of overfitting.
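A minimal sketch of both points, using a random NumPy matrix as a stand-in for one layer's weights (in practice you would load the real weights from your pretrained model):

```python
import numpy as np

# Hypothetical weight matrix standing in for one layer of a
# pretrained network.
rng = np.random.default_rng(0)
weights = rng.normal(loc=0.0, scale=0.5, size=(64, 32))

# Point 1: the sign is just a coefficient -- flipping every sign
# leaves the magnitudes (the "importance" proxy) unchanged.
magnitudes = np.abs(weights)
assert np.array_equal(magnitudes, np.abs(-weights))

# Point 2: magnitude matters. One rough diagnostic is the
# distribution of |w|; a heavy tail of large-abs weights can hint
# at overfitting (L2 regularization penalizes exactly this).
mean_mag = magnitudes.mean()
frac_large = (magnitudes > 3 * mean_mag).mean()
print(f"mean |w| = {mean_mag:.3f}, "
      f"fraction of large weights = {frac_large:.2%}")
```

Note that magnitude is only a crude proxy for importance: pruning methods do often drop near-zero weights first, but a weight's true effect also depends on the activations flowing through it.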

Upvotes: 0
