DvdV

Reputation: 77

Tuning hyperparameters Inception model with checkpoints

I have a question about tuning hyperparameters for the Inception ResNet V2 model (or any other DL model) that I can't really wrap my head around. Right now, I have set certain hyperparameters, such as learning_rate, decay_factor and decay_after_nr_epochs. My model saves checkpoints, so it can continue from these points later on. If I run the model again with more epochs, it logically resumes training from the last checkpoint.
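For reference, the three hyperparameters named above are typically combined into a step-decay schedule. This is only a sketch of one common interpretation (multiplicative decay every `decay_after_nr_epochs` epochs); the exact schedule in any given Inception training script may differ:

```python
def decayed_learning_rate(initial_lr, decay_factor, decay_after_nr_epochs, epoch):
    """Step-decay schedule: multiply the initial learning rate by
    decay_factor once every decay_after_nr_epochs epochs."""
    return initial_lr * decay_factor ** (epoch // decay_after_nr_epochs)

# With initial_lr=0.0002, decay_factor=0.5 and decay every 10 epochs,
# epochs 0-9 train at 0.0002 and epochs 10-19 train at 0.0001.
```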

However, if I set new hyperparameters, such as learning_rate = 0.0001 instead of learning_rate = 0.0002, does it make sense to continue from the checkpoints, or is it better to apply the new hyperparameters to the initial (untrained) model?

The latter sounds more logical to me, but I'm not sure whether this is necessary.

Thanks in advance.

Upvotes: 0

Views: 761

Answers (1)

talos1904

Reputation: 982

Both methods are okay, but you have to watch your training loss after adjusting the hyperparameters. If the loss converges in both cases, it's fine; otherwise, adjust accordingly.

However, as far as I know, people commonly adopt one of these two methods:

1. Start with a higher learning rate and apply a decay factor, so the learning rate is reduced gradually as training starts to converge.
2. Keep an eye on the loss function and stop early if you think you can adjust to a better learning rate.
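The second method above can be sketched as a simple early-stopping check. This is a generic, hand-rolled heuristic (the `patience` and `min_delta` names are illustrative, borrowed from common callback APIs), not code from any particular framework:

```python
class EarlyStopper:
    """Signal a stop when the validation loss hasn't improved
    for `patience` consecutive checks."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience      # how many bad checks to tolerate
        self.min_delta = min_delta    # minimum change that counts as improvement
        self.best_loss = float("inf")
        self.bad_checks = 0

    def should_stop(self, val_loss):
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss  # new best: reset the counter
            self.bad_checks = 0
        else:
            self.bad_checks += 1       # no improvement this check
        return self.bad_checks >= self.patience
```

Once training stops, you could lower the learning rate and resume from the best checkpoint.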

Upvotes: 1
