Reputation: 494
What does the term "epoch" mean in a neural network, and how does it differ from a pass and an iteration?
Upvotes: 3
Views: 1349
Reputation: 34
An epoch is one full round of forward propagation and backward propagation through the neural network over the entire dataset.

As an analogy, think of throwing a ball at a basket: one round of throwing the ball, measuring the error, coming back, and changing the weights is one pass (here the "model" is f = ma).

Forward propagation is the process of initializing the parameters (the mass and acceleration in this analogy) with random values and predicting the output. Backward propagation is changing those values and predicting again, using the gradient to decide how to change them. The gradient tells you how much the output y (the dependent variable) changes when you change the input x (the independent variable).

There is no fixed answer for how many epochs you need. It depends on the dataset; roughly, the number of epochs is related to how varied your data is. For example, does your dataset contain only white tigers, or many different kinds of images?
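To make the forward/backward idea concrete, here is a minimal sketch in plain Python, fitting a single weight w in y = w * x with gradient descent. The function name and data are hypothetical, not from the question; it only illustrates the loop of predict, measure error, update.

```python
# Toy example (hypothetical): fit y = w * x by gradient descent.
# Forward propagation: predict with the current weight.
# Backward propagation: compute the gradient of the squared error
# with respect to w and nudge w in the opposite direction.

def train(xs, ys, lr=0.01, epochs=100):
    w = 0.0  # arbitrary initial weight
    for _ in range(epochs):          # one epoch = one pass over all examples
        for x, y in zip(xs, ys):
            y_pred = w * x           # forward pass
            error = y_pred - y       # prediction error
            grad = 2 * error * x     # d(error**2)/dw
            w -= lr * grad           # weight update
    return w

w = train([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
print(round(w, 2))  # converges toward 2.0, since the data follow y = 2x
```

Each inner-loop step is one update; each sweep over the whole dataset is one epoch.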
An iteration is one update step on a single batch, so the number of iterations per epoch equals the number of batches.

For example, if we divide a dataset of 1,000 examples into batches of 250, it takes 4 iterations to complete 1 epoch (batch size = 250, iterations = 4).
Upvotes: 1
Reputation: 22023
There are many neural-network algorithms in unsupervised learning. As long as a cost function can be defined, neural networks can be used.

For instance, there are autoencoders, for dimensionality reduction, and Generative Adversarial Networks (two networks, one generating new samples). All of these are unsupervised learning and still use neural networks.
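The key point, that the cost function can be defined from the inputs alone, is easy to see with an autoencoder: its loss compares the input with its own reconstruction, so no labels are needed. Below is a deliberately tiny, hand-crafted sketch (a fixed toy encoder/decoder, not a trained model) just to show the shape of such a loss:

```python
# Hypothetical toy "autoencoder": compress a 2-D input to a 1-D code
# and reconstruct it. The reconstruction loss needs no labels, which
# is what makes this setup unsupervised.

def encode(x):
    # compress 2-D input to a 1-D code (its mean)
    return [(x[0] + x[1]) / 2]

def decode(code):
    # reconstruct a 2-D output from the 1-D code
    return [code[0], code[0]]

def reconstruction_loss(x):
    x_hat = decode(encode(x))
    return sum((a - b) ** 2 for a, b in zip(x, x_hat))

print(reconstruction_loss([3.0, 3.0]))  # 0.0: perfectly reconstructed
print(reconstruction_loss([1.0, 5.0]))  # 8.0: information lost in compression
```

A real autoencoder would learn `encode` and `decode` by minimizing this same kind of loss with gradient descent.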
Upvotes: 2