【DL】Deep Learning Terminology

epoch: one complete pass over the entire training set is called one epoch.

one epoch = one forward pass and one backward pass of all the training examples

The corresponding parameter in the code is n_epochs.

batch_size: the number of samples in one batch.

In general a training set contains a large number of samples, and limited memory usually makes it impossible to load them all at once; splitting the data also speeds up training. The whole training set is therefore divided into n_batch groups, each containing batch_size samples.

training set size = batch_size * n_batch

batch size = the number of training examples in one forward/backward pass. The higher the batch size, the more memory space you’ll need.
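As a concrete sketch of this split (the array shape and the specific sizes are illustrative assumptions), the training set can be cut into n_batch slices of batch_size samples each:

import numpy as np

numSamples = 100_000
train_set = np.random.randn(numSamples, 20)   # 100,000 samples, 20 features each
batch_size = 10_000
n_batch = numSamples // batch_size            # training set size = batch_size * n_batch
batches = [train_set[k * batch_size:(k + 1) * batch_size] for k in range(n_batch)]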

iteration: one training step on all the samples of a single batch is called one iteration.

number of iterations = number of passes, each pass using [batch size] number of examples. To be clear, one pass = one forward pass + one backward pass (we do not count the forward pass and backward pass as two different passes)
n_iterations = n_epochs * n_batch

The overall flow looks like this:

# number of epochs
n_epochs = 100
# total number of samples
numSamples = 100_000
# the samples are split into n_batch groups
n_batch = 10
# number of samples in each batch
batch_size = numSamples // n_batch
# training loop
iterations = 0
for i in range(n_epochs):
    for j in range(n_batch):
        # train on the j-th batch
        train(j)
        # increment the iteration count
        iterations = iterations + 1
# afterwards iterations == n_epochs * n_batch == 1000
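The same epoch / batch / iteration structure shows up when training with a framework. Below is a minimal sketch using PyTorch's DataLoader, mirroring the numbers above; the dataset shapes, model, and hyperparameters are assumptions made only for illustration.

import torch
from torch.utils.data import DataLoader, TensorDataset

# 100,000 made-up samples with 20 features each and binary labels
X = torch.randn(100_000, 20)
y = torch.randint(0, 2, (100_000,))
loader = DataLoader(TensorDataset(X, y), batch_size=10_000, shuffle=True)  # n_batch = 10

model = torch.nn.Linear(20, 2)
loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

n_epochs = 100
iterations = 0
for epoch in range(n_epochs):          # one epoch = one pass over the whole training set
    for xb, yb in loader:              # each batch gives one iteration
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)  # forward pass
        loss.backward()                # backward pass
        optimizer.step()
        iterations += 1
# iterations is now n_epochs * n_batch = 100 * 10 = 1000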
LCN: Local Contrast Normalization

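A minimal sketch of local contrast normalization on a single-channel image, assuming a Gaussian-weighted neighborhood (the function name and the sigma and eps values are illustrative assumptions): each pixel is centered by its local mean and scaled by its local standard deviation.

import numpy as np
from scipy.ndimage import gaussian_filter

def local_contrast_normalize(img, sigma=2.0, eps=1e-8):
    # subtract the Gaussian-weighted local mean around each pixel
    local_mean = gaussian_filter(img, sigma)
    centered = img - local_mean
    # divide by the Gaussian-weighted local standard deviation,
    # with a small floor so nearly flat regions are not blown up
    local_std = np.sqrt(gaussian_filter(centered ** 2, sigma))
    return centered / np.maximum(local_std, eps)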

weight decay: a regularization method used to prevent overfitting.

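A minimal sketch of a plain gradient step with L2-style weight decay; the function and parameter names (lr, lambda_wd) are assumptions for illustration, not any library's API.

def sgd_step_with_weight_decay(w, grad, lr=0.01, lambda_wd=1e-4):
    # weight decay adds lambda_wd * w to the gradient,
    # shrinking the weights toward zero a little at every step
    return w - lr * (grad + lambda_wd * w)

In many frameworks this appears as a weight_decay argument on the optimizer, e.g. torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4).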

momentum: a commonly used method for accelerating SGD.

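A minimal sketch of SGD with classical momentum; the names and the mu = 0.9 coefficient are illustrative assumptions. The velocity keeps an exponentially decaying memory of past gradients, which speeds up descent along directions the gradients consistently agree on.

def sgd_momentum_step(w, grad, velocity, lr=0.01, mu=0.9):
    # accumulate a decaying sum of past gradients
    velocity = mu * velocity - lr * grad
    # move the parameters along the accumulated velocity
    w = w + velocity
    return w, velocity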



Reposted from blog.csdn.net/baishuo8/article/details/81941289