1. epoch
The parameter n_epochs appears frequently in training code. What does it actually mean?
In one epoch, every sample in the training set is used exactly once:
one epoch = one forward pass and one backward pass of all the training examples
2. batch_size
A training set usually contains a large number of samples. To speed up training, the whole training set is split into n_batch groups, each containing batch_size samples.
That is: total number of samples in the dataset = batch_size * n_batch
batch size = the number of training examples in one forward/backward pass. The higher the batch size, the more memory space you'll need.
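The split described above can be sketched in a few lines of Python (the variable names here are illustrative, not from the original post):

```python
# A minimal sketch of splitting a training set into n_batch groups
# of batch_size samples each.
num_samples = 1000                   # total samples in the training set
batch_size = 100                     # samples per forward/backward pass
n_batch = num_samples // batch_size  # number of batches per epoch

# represent each batch as a list of sample indices
batches = [list(range(i * batch_size, (i + 1) * batch_size))
           for i in range(n_batch)]

print(n_batch)          # 10 batches per epoch
print(len(batches[0]))  # each batch holds batch_size samples: 100
```

Note this simple split assumes num_samples is evenly divisible by batch_size; real data loaders must decide whether to drop or pad a final smaller batch.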
3. iterations
The terms iteration and epoch are easy to confuse, but they are completely different concepts.
The work done in each iteration is: training the model on the samples of one batch.
number of iterations = number of passes, each pass using [batch size] examples. To be clear, one pass = one forward pass + one backward pass (the forward pass and backward pass are not counted as two separate passes).
Concretely:

```python
n_epochs = 100                       # number of epochs
num_samples = 100000                 # total number of samples
n_batch = 10                         # split the samples into n_batch groups
batch_size = num_samples // n_batch  # samples per batch

def train(j):
    pass  # placeholder: train the model on the j-th batch

# start training
iterations = 0
for i in range(n_epochs):
    for j in range(n_batch):
        train(j)         # train on the j-th batch
        iterations += 1  # count one iteration
```
As you can see: iterations = n_epochs * n_batch
That is, each epoch performs n_batch training steps, and each step uses batch_size samples.
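This relationship can be verified numerically with a short sketch that just counts the loop passes (no model is actually trained here):

```python
# Check that the counted iterations equal n_epochs * n_batch.
n_epochs = 100
n_batch = 10

iterations = 0
for epoch in range(n_epochs):
    for j in range(n_batch):
        # one iteration = one forward + one backward pass on batch j
        iterations += 1

print(iterations)                        # 1000
print(iterations == n_epochs * n_batch)  # True
```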
Original article: http://www.cnblogs.com/lutingting/p/5252589.html