Georgie Ji

Reputation: 15

How to implement naive batch gradient descent?

Hi everyone. I have a question about implementing gradient descent. I have found several optimizers, like ada_grad, adam, sgd and so on, and they work well. But I want to implement the naive method, batch gradient descent: a fixed learning rate, with each update computed over the whole training set rather than a minibatch. How can I do this? Thanks very much.
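For reference, here is a minimal sketch of what naive batch gradient descent looks like written directly in NumPy, using linear regression with a mean-squared-error loss as a stand-in model. The function name, toy data, and hyperparameters are all illustrative, not from any particular library:

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, n_iters=500):
    """Minimize mean squared error over the WHOLE dataset each step.

    Every iteration uses all examples (one full batch) and a fixed
    learning rate -- the "naive" gradient descent the question asks about.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        # Gradient of 0.5 * mean((X @ w - y)**2) over all n examples.
        grad = X.T @ (X @ w - y) / n
        w -= lr * grad  # fixed-learning-rate update
    return w

# Toy data: y = 2*x0 - 3*x1 with no noise, so the fit should recover
# the true weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -3.0])
y = X @ true_w

w = batch_gradient_descent(X, y)
```

The only difference from minibatch SGD is that the gradient is always computed on the full dataset, so each step is deterministic.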

Upvotes: 0

Views: 106

Answers (1)

Yuya Unno

Reputation: 16

How about using SGD with a batch size large enough to cover the whole training set? That is equivalent to the plain gradient descent method.
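To see why this works, here is a small NumPy demonstration (illustrative names and data, not tied to any framework): an SGD-style update whose "minibatch" happens to be the entire dataset produces exactly the same step as batch gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = rng.normal(size=50)
w = np.zeros(3)
lr = 0.05

# One batch-GD step: gradient of mean squared error over ALL examples.
grad_full = X.T @ (X @ w - y) / len(y)
w_gd = w - lr * grad_full

# One "SGD" step where the sampled batch is the whole dataset.
batch_idx = np.arange(len(y))  # batch size N = entire training set
Xb, yb = X[batch_idx], y[batch_idx]
grad_batch = Xb.T @ (Xb @ w - yb) / len(yb)
w_sgd = w - lr * grad_batch

print(np.allclose(w_gd, w_sgd))  # → True: the two updates coincide
```

So with an existing SGD optimizer, setting the batch size to the full dataset size (and disabling shuffling-related randomness, which no longer matters) gives you naive batch gradient descent.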

Upvotes: 0
