# mini-batch

A small, randomly selected subset of a [batch](https://wiki.g15e.com/pages/Batch%20(machine%20learning.txt)) processed in one [iteration](https://wiki.g15e.com/pages/Iteration%20(machine%20learning.txt)). The [batch size](https://wiki.g15e.com/pages/Batch%20size%20(machine%20learning.txt)) of a mini-batch is usually between 10 and 1,000 examples.[^1]

For example, suppose the entire training set (the full batch) consists of 1,000 examples, and the batch size of each mini-batch is set to 20. Each iteration then determines the loss on a random 20 of the 1,000 examples and adjusts the [weights](https://wiki.g15e.com/pages/Weight%20(machine%20learning.txt)) and [biases](https://wiki.g15e.com/pages/Bias%20(machine%20learning.txt)) accordingly.[^1]

It is much more efficient to calculate the [loss](https://wiki.g15e.com/pages/Loss%20(machine%20learning.txt)) on a mini-batch than on all the examples in the full batch.[^1]

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#mini-batch
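The sampling step described above (a random 20 of 1,000 examples per iteration) can be sketched in Python with NumPy; the data, the batch size, and the trivial "predict zero" loss are illustrative assumptions, not part of the glossary definition.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical full batch: 1,000 training examples with 5 features each,
# plus one label per example (shapes chosen only for illustration).
features = rng.normal(size=(1000, 5))
labels = rng.normal(size=1000)

batch_size = 20  # mini-batch size from the example above

# One iteration: draw a random mini-batch of 20 of the 1,000 examples.
indices = rng.choice(len(features), size=batch_size, replace=False)
mini_features = features[indices]
mini_labels = labels[indices]

# Compute the loss on the mini-batch only (mean squared error against a
# trivial model that always predicts zero, purely for illustration);
# a training loop would use this loss to adjust weights and biases.
predictions = np.zeros(batch_size)
loss = np.mean((predictions - mini_labels) ** 2)

print(mini_features.shape)  # (20, 5)
```

Each iteration of a real training loop would redraw `indices`, so different mini-batches cover different parts of the full batch over time.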