Deep Eye PyTorch Training Camp

The six learning rate adjustment strategies in PyTorch

class _LRScheduler

Main attributes:

  • optimizer: the associated optimizer
  • last_epoch: records the current epoch count
  • base_lrs: records the initial learning rate of each parameter group

Main methods (typical usage is sketched below):

  • step(): updates the learning rate for the next epoch
  • get_lr(): virtual function that computes the learning rate for the next epoch
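A minimal sketch of the typical pattern (the model, optimizer settings, and the StepLR choice are illustrative, not from the original; StepLR is the first of the six strategies, covered next): the scheduler wraps an optimizer, and step() is called once per epoch, after optimizer.step().

```python
import torch
import torch.optim as optim

model = torch.nn.Linear(10, 2)  # placeholder model
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

print(scheduler.base_lrs)    # [0.1] -- initial learning rate per parameter group
print(scheduler.last_epoch)  # 0    -- epoch counter

for epoch in range(100):
    # ... forward pass, loss.backward(), optimizer.step() ...
    scheduler.step()  # computes and applies the learning rate for the next epoch
```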

Learning rate adjustment

1. StepLR

Function: adjusts the learning rate at equal intervals

Main parameters:

  • step_size: adjustment interval (in epochs)
  • gamma: adjustment factor

Adjustment rule: lr = lr * gamma (applied every step_size epochs)
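A runnable sketch with assumed values (lr=0.1, step_size=50, gamma=0.1 are illustrative, not from the original):

```python
import torch
import torch.optim as optim

# a single dummy parameter stands in for real model weights
w = torch.zeros(1, requires_grad=True)
optimizer = optim.SGD([w], lr=0.1)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.1)

for epoch in range(200):
    # ... one epoch of training ...
    scheduler.step()
# lr per epoch: 0.1 (epochs 0-49), 0.01 (50-99), 0.001 (100-149), 0.0001 (150-199)
```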

2. MultiStepLR

Function: adjusts the learning rate at specified epochs

Main parameters:

  • milestones: list of epoch indices at which to adjust
  • gamma: adjustment factor

Adjustment rule: lr = lr * gamma (applied at each milestone)
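A sketch with assumed milestones (50 and 125 are arbitrary example values):

```python
import torch
import torch.optim as optim

w = torch.zeros(1, requires_grad=True)
optimizer = optim.SGD([w], lr=0.1)
# multiply the lr by gamma when epochs 50 and 125 are reached
scheduler = optim.lr_scheduler.MultiStepLR(optimizer, milestones=[50, 125], gamma=0.1)

for epoch in range(200):
    scheduler.step()
# lr: 0.1 before epoch 50, 0.01 from 50 to 124, 0.001 from 125 on
```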

3. ExponentialLR

Function: decays the learning rate exponentially

Main parameters:

  • gamma: the base of the exponent

Adjustment rule: lr = base_lr * gamma ** epoch (i.e., the lr is multiplied by gamma every epoch)
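A sketch with an assumed decay base (gamma=0.95 is illustrative):

```python
import torch
import torch.optim as optim

w = torch.zeros(1, requires_grad=True)
optimizer = optim.SGD([w], lr=0.1)
scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.95)

for epoch in range(100):
    scheduler.step()
# lr at epoch t: 0.1 * 0.95 ** t (each step multiplies the lr by gamma)
```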

4. CosineAnnealingLR

Function: anneals the learning rate along a cosine curve, periodically

Main parameters:

  • T_max: number of epochs in the descending half-period
  • eta_min: minimum learning rate (the lower bound of the decay)

Adjustment rule: eta_t = eta_min + (eta_max - eta_min) * (1 + cos(T_cur / T_max * pi)) / 2, where eta_max is the initial learning rate and T_cur is the current epoch count
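A sketch with assumed values (T_max=50, eta_min=0.0 are illustrative):

```python
import torch
import torch.optim as optim

w = torch.zeros(1, requires_grad=True)
optimizer = optim.SGD([w], lr=0.1)
# anneal from the initial lr (eta_max = 0.1) down to eta_min over T_max epochs
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50, eta_min=0.0)

for epoch in range(200):
    scheduler.step()
# the lr traces a cosine: 0.1 -> 0.0 over epochs 0-50, back up to 0.1 by epoch 100, ...
```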

5. ReduceLROnPlateau

Function: monitors a metric and reduces the learning rate when the metric stops improving

Main parameters:

  • mode: 'min' or 'max', i.e., whether the monitored metric should decrease or increase
  • factor: adjustment factor (the lr is multiplied by factor)
  • patience: number of epochs with no improvement to tolerate before adjusting
  • cooldown: number of epochs to pause monitoring after an adjustment
  • verbose: whether to print a message when the lr is updated
  • min_lr: lower bound on the learning rate
  • eps: minimum decay applied to the lr; updates smaller than eps are ignored
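A sketch with assumed settings and a simulated plateau (factor and patience values are illustrative); note that, unlike the other schedulers, step() takes the monitored metric:

```python
import torch
import torch.optim as optim

w = torch.zeros(1, requires_grad=True)
optimizer = optim.SGD([w], lr=0.1)
scheduler = optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.1, patience=10)

for epoch in range(30):
    val_loss = 1.0  # stand-in for a real validation loss that has plateaued
    scheduler.step(val_loss)  # pass the monitored metric to step()
# the lr holds at 0.1 until `patience` epochs pass without improvement, then drops to 0.01
```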

6. LambdaLR

Function: custom adjustment strategy

Main parameters:

  • lr_lambda: a function, or a list of functions (one per parameter group)
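A sketch with two parameter groups and assumed lambdas (the decay functions are arbitrary examples); each lambda receives the epoch index and returns a factor that multiplies the corresponding base lr:

```python
import torch
import torch.optim as optim

w1 = torch.zeros(1, requires_grad=True)
w2 = torch.zeros(1, requires_grad=True)
optimizer = optim.SGD([{'params': [w1]}, {'params': [w2]}], lr=0.1)

lambda1 = lambda epoch: 0.1 ** (epoch // 20)  # step-like decay for group 0
lambda2 = lambda epoch: 0.95 ** epoch         # exponential decay for group 1
scheduler = optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=[lambda1, lambda2])

for epoch in range(100):
    scheduler.step()
# group 0 lr: 0.1 * lambda1(epoch); group 1 lr: 0.1 * lambda2(epoch)
```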

 

Summary:

1. Ordered adjustment: StepLR, MultiStepLR, ExponentialLR, CosineAnnealingLR

2. Adaptive adjustment: ReduceLROnPlateau

3. Custom adjustment: LambdaLR

 

Learning rate initialization:

1. Start with a small value, e.g., 0.01, 0.001, or 0.0001

2. Search for the maximum usable learning rate
