StepLR: decaying the learning rate at equal intervals

StepLR adjusts the learning rate at equal intervals: every step_size epochs, the current learning rate is multiplied by gamma, i.e. the new rate is lr * gamma. Note that the interval unit is the epoch, not the iteration: scheduler.step() is intended to be called once per epoch, not once per batch.
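
Because the decay is multiplicative and happens once per completed interval, the schedule has a simple closed form. The helper below is a minimal sketch of that rule (step_lr is a hypothetical name for illustration, not part of PyTorch):

# Closed-form version of the StepLR rule: the learning rate at a given
# epoch is the initial lr scaled by gamma once per completed interval.
def step_lr(initial_lr: float, epoch: int, step_size: int, gamma: float = 0.1) -> float:
    return initial_lr * gamma ** (epoch // step_size)

print(step_lr(0.05, 29, step_size=30))  # 0.05   (no decay yet)
print(step_lr(0.05, 30, step_size=30))  # 0.005  (first decay)
print(step_lr(0.05, 65, step_size=30))  # 0.0005 (two decays)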

torch.optim.lr_scheduler.StepLR(optimizer, step_size, gamma=0.1, last_epoch=-1)
Parameters:
step_size (int) - period of the learning rate decay. If set to 30, the learning rate is multiplied by gamma at epochs 30, 60, 90, ...
gamma (float) - multiplicative factor of the decay. The default is 0.1, i.e. the learning rate is divided by 10 at each step.
last_epoch (int) - index of the last epoch, used to resume a schedule mid-training. With the default value of -1, the schedule starts from the initial learning rate.
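
To see the schedule in action, here is a minimal runnable sketch that prints the learning rate used in each epoch (the single dummy parameter and the SGD settings are only for illustration):

import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

param = torch.nn.Parameter(torch.zeros(1))
optimizer = SGD([param], lr=0.05)
scheduler = StepLR(optimizer, step_size=3, gamma=0.1)

for epoch in range(7):
    # learning rate actually used during this epoch
    print(epoch, round(optimizer.param_groups[0]["lr"], 6))
    optimizer.step()      # in real training: forward/backward first
    scheduler.step()
# epochs 0-2 print 0.05, epochs 3-5 print 0.005, epoch 6 prints 0.0005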

For example:

# Assuming optimizer uses lr = 0.05 for all groups
# lr = 0.05     if epoch < 30
# lr = 0.005    if 30 <= epoch < 60
# lr = 0.0005   if 60 <= epoch < 90
# ...
from torch.optim.lr_scheduler import StepLR

scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()
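
Note that the call order matters: since PyTorch 1.1.0, scheduler.step() should be called after optimizer.step() in the loop; calling it before, or once per batch instead of once per epoch, will shift or compress the schedule.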
   

Reprinted from: https://www.cnblogs.com/xym4869/p/11654611.html

Origin: blog.csdn.net/qq_35037684/article/details/113312478