Batch size (machine learning)

In machine learning, the batch size is the number of training examples used in one iteration, i.e. in one update of the model's parameters. The batch size is typically set in one of three modes (a minimal sketch follows the list):

  • batch mode: the batch size equals the total dataset size, so one iteration processes the entire dataset and an iteration and an epoch are equivalent.
  • mini-batch mode: the batch size is greater than one but less than the total dataset size, usually chosen so that it divides the dataset size evenly.
  • stochastic mode: the batch size is equal to one, so the gradient and the neural network parameters are updated after each individual sample.
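The sketch below illustrates how the batch size controls the update schedule, using plain NumPy gradient descent on a toy linear-regression problem. The dataset, learning rate, and epoch count are illustrative assumptions, not part of the definition above; the same loop covers all three modes depending on the value passed as batch_size.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 training examples, 3 features
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)


def train(batch_size, epochs=20, lr=0.1):
    """Mean-squared-error gradient descent.

    batch_size selects the mode:
      len(X) -> batch mode, 1 -> stochastic mode, anything between -> mini-batch mode.
    """
    w = np.zeros(X.shape[1])
    n = len(X)
    for _ in range(epochs):
        order = rng.permutation(n)                  # shuffle once per epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]   # examples for this iteration
            xb, yb = X[idx], y[idx]
            grad = 2 * xb.T @ (xb @ w - yb) / len(idx)
            w -= lr * grad                          # one parameter update per batch
    return w


print(train(batch_size=len(X)))   # batch mode: one update per epoch
print(train(batch_size=25))       # mini-batch mode: 25 divides 100 evenly
print(train(batch_size=1))        # stochastic mode: update after each sample
```

With a dataset of 100 examples, batch mode performs one update per epoch, a mini-batch size of 25 performs four, and stochastic mode performs one hundred, which is the trade-off between gradient accuracy and update frequency that the three modes represent.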