PyTorch: MultiStepLR

class torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones, gamma=0.1, last_epoch=-1)

>>> # Assuming optimizer uses lr = 0.05 for all groups
>>> # lr = 0.05     if epoch < 30
>>> # lr = 0.005    if 30 <= epoch < 80
>>> # lr = 0.0005   if epoch >= 80
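MultiStepLR decays the learning rate of each parameter group by gamma once the epoch count reaches each entry in milestones (a list of increasing epoch indices). Below is a minimal runnable sketch of the schedule shown in the comments above; the linear model and the print statements are illustrative additions, not part of the original post:

import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import MultiStepLR

model = torch.nn.Linear(10, 2)  # stand-in model; anything with parameters works
optimizer = SGD(model.parameters(), lr=0.05)

# Decay the lr by gamma=0.1 at epochs 30 and 80
scheduler = MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)

for epoch in range(100):
    # train(...) / validate(...) would run here; optimizer.step() is
    # normally called once per batch during training.
    optimizer.step()
    if epoch in (0, 29, 30, 79, 80):
        print(epoch, optimizer.param_groups[0]["lr"])
    scheduler.step()  # advance the schedule once per epoch

This prints lr = 0.05 through epoch 29, 0.005 from epoch 30, and 0.0005 from epoch 80, matching the comments. Note that in recent PyTorch releases scheduler.step() should be called after optimizer.step(); calling it first triggers a warning.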

Reposted from: https://www.cnblogs.com/z1141000271/p/9417473.html
