# Batch size (machine learning)

> The number of examples in a batch. For instance, if the batch size is 100, then the model processes 100 examples per iteration.

The number of [examples](https://wiki.g15e.com/pages/Example%20(machine%20learning.txt)) in a [batch](https://wiki.g15e.com/pages/Batch%20(machine%20learning.txt)). For instance, if the batch size is 100, then the [model](https://wiki.g15e.com/pages/Model%20(machine%20learning.txt)) processes 100 examples per [iteration](https://wiki.g15e.com/pages/Iteration%20(machine%20learning.txt)).

The following are popular batch size strategies:

- [Stochastic gradient descent](https://wiki.g15e.com/pages/Stochastic%20gradient%20descent.txt), in which the batch size is 1.
- Full batch, in which the batch size is the number of examples in the entire [training set](https://wiki.g15e.com/pages/Training%20data.txt). For instance, if the training set contains a million examples, then the batch size would be a million examples. Full batch is usually an inefficient strategy.
- Mini-batch stochastic gradient descent, in which the batch size is usually between 10 and 1000. Mini-batch is usually the most efficient strategy.
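The three strategies above can be sketched as a simple batching helper. This is a minimal illustration, not any particular library's API; the function name `iterate_minibatches` and the toy training set are assumptions for the example.

```python
def iterate_minibatches(examples, batch_size):
    """Yield successive batches of at most `batch_size` examples.

    Illustrative helper, not part of any specific ML library.
    """
    for start in range(0, len(examples), batch_size):
        yield examples[start:start + batch_size]


# A toy training set of 1000 examples.
training_set = list(range(1000))

# Stochastic gradient descent: batch size 1 -> 1000 iterations per epoch.
sgd_iters = len(list(iterate_minibatches(training_set, 1)))

# Mini-batch: batch size 100 -> 10 iterations per epoch.
minibatch_iters = len(list(iterate_minibatches(training_set, 100)))

# Full batch: batch size equals the training-set size -> 1 iteration per epoch.
full_iters = len(list(iterate_minibatches(training_set, len(training_set))))

print(sgd_iters, minibatch_iters, full_iters)  # 1000 10 1
```

The batch size trades off how often the model's parameters are updated (many small, noisy updates at batch size 1) against the cost of each update (one expensive pass over all examples at full batch).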