logankilpatrick

Reputation: 14521

How to set the Learning Rate of an optimizer in Flux.jl

I would like to set the default learning rate for my optimizer in Flux. I was looking at this example: https://fluxml.ai/Flux.jl/stable/training/optimisers/ and it appears that the interface to do so is through the update! function. Is this the way to set the learning rate, or are there other options as well?

Upvotes: 1

Views: 404

Answers (1)

logankilpatrick

Reputation: 14521

As mentioned in the Flux.jl docs, the learning rate is set when you construct the optimiser; the update!() function then applies that optimiser's update rule to the parameters. In the case of gradient descent:

Descent(η = 0.1): Classic gradient descent optimiser with learning rate η. For each parameter p and its gradient δp, this runs p -= η*δp.

which means we can pass a learning rate (commonly somewhere between 0.1 and 0.001) to the Descent constructor to set it, as in the sketch below.
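Here is a minimal sketch of that pattern, using the implicit-parameters style from the docs page linked in the question (the model W, b and data x, y are made up for illustration; note that newer Flux versions have since moved to an explicit Flux.setup-based API):

```julia
using Flux

# Toy "model" and data, purely for illustration
W = rand(2, 3)
b = rand(2)
x, y = rand(3), rand(2)

loss() = sum((W * x .+ b .- y) .^ 2)

ps = Flux.params(W, b)      # parameters to be trained
opt = Descent(0.1)          # the learning rate η is set here, at construction

gs = gradient(() -> loss(), ps)     # compute gradients w.r.t. ps
Flux.Optimise.update!(opt, ps, gs)  # one step: p -= η * δp for each parameter
```

So update! performs the step, but the learning rate itself lives in the optimiser object you pass to it.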

The other optimisers take the learning rate the same way, as a constructor argument, and you can find the full list here: https://fluxml.ai/Flux.jl/stable/training/optimisers/#Optimiser-Reference
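For example (a hedged sketch; the exact constructor spellings, such as ADAM vs. Adam, depend on your Flux version):

```julia
using Flux

opt = Momentum(0.01)   # gradient descent with momentum, η = 0.01
opt = ADAM(0.001)      # ADAM; 0.001 is also its default learning rate

# The optimisers in this generation of Flux are mutable structs, so the
# learning rate can also be changed on an existing optimiser:
opt.eta = 0.0005
```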

Upvotes: 2
