Reputation: 941
I do not get the idea of iterations in machine learning. Since programming is deterministic (as in it does the same thing each time it is run), how can accuracy improve by running the same algorithm over and over again?
Does this iteration concept have anything to do with RNNs (recurrent neural networks), in the sense that they feed back into the same neurons multiple times, or with SGD (stochastic gradient descent), where random samples of the training data are used for efficiency?
Thanks
Edit: What I mean by Iterations is in this toy app I found on this site: https://blog.kovalevskyi.com/rnn-based-chatbot-for-6-hours-b847d2d92c43
What the author did was use an RNN to create a chatbot. What I fail to understand is how increasing the number of iterations increases the accuracy of the prediction, since the same algorithm is being run each time.
But from @Spirit_Dongdong's post, it seems that my understanding of iterations (as in what is done in each iteration) might be wrong, so I am trying to clarify what is meant by an iteration and what is done in one.
Upvotes: 3
Views: 4215
Reputation: 56357
What the article talks about is training iterations. When training neural networks, we use an iterative algorithm, typically stochastic gradient descent.
This algorithm solves an optimization problem: we want to minimize a function, but we do not know the parameter values that minimize it. The gradient tells us the direction in which to move the parameters, but not how far, so we move a fixed amount (the step size, or learning rate).
With all of this in mind: if we take one step against the gradient direction, we get closer to the solution (the minimum), but only a little, so we move a little more, and so on. These are the iterations. Increasing the number of iterations gets you closer to the minimum, and thus to the optimal parameters, and that is what improves performance.
You are right that the same algorithm is being run, but not with the same inputs: the inputs keep changing, because the current value of the parameters changes after each iteration.
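A minimal sketch can make this concrete (this is not the article's code, just a toy example): gradient descent on f(x) = (x - 3)², whose gradient is 2(x - 3) and whose minimum is at x = 3. Each iteration runs the same update rule, but on the parameter value the previous iteration produced.

```python
def gradient_descent(x0, learning_rate=0.1, iterations=10):
    """Minimize f(x) = (x - 3)**2 by repeated gradient steps."""
    x = x0
    for _ in range(iterations):
        grad = 2 * (x - 3)            # gradient of f at the current x
        x = x - learning_rate * grad  # move a fixed step against the gradient
    return x

# Same algorithm, same starting point; only the iteration count differs.
print(gradient_descent(0.0, iterations=5))   # still some distance from 3
print(gradient_descent(0.0, iterations=50))  # very close to the minimum at 3
```

With 5 iterations the result is still around 2.0; with 50 it is essentially 3.0, which is exactly the "more iterations, better accuracy" effect from the question.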
Upvotes: 3
Reputation: 141
Most machine learning problems are optimization problems, such as minimizing a loss function or maximizing a likelihood function. There are different optimization methods: some, like OLS (ordinary least squares), do not need iteration, while others, like SGD or Newton's method, find a direction in which to optimize and then iterate.
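As a hedged illustration of that contrast (a toy sketch, not a reference implementation), here is the same least-squares problem solved both ways: once with the closed-form OLS solution (no iterations) and once with plain gradient descent (many iterations). The data and step size are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w                      # noiseless targets, so both methods agree

# Closed-form OLS: one linear-algebra solve, no iterations needed.
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Iterative gradient descent on the same squared loss: repeat a small step.
w = np.zeros(2)
lr = 0.01
for _ in range(2000):
    grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of the mean squared error
    w -= lr * grad

# After enough iterations, the iterative answer matches the closed-form one.
```

The point is that the iterative method only reaches the same weights because each step starts from the weights the previous step produced.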
Upvotes: 0