# Stochastic gradient descent

> A gradient descent algorithm in which the batch size is one. In other words, SGD trains on a single example chosen uniformly at random from a training set.[^1]

A [gradient descent](https://wiki.g15e.com/pages/Gradient%20descent.txt) algorithm in which the [batch size](https://wiki.g15e.com/pages/Batch%20size%20(machine%20learning).txt) is one. In other words, SGD trains on a single [example](https://wiki.g15e.com/pages/Example%20(machine%20learning).txt) chosen uniformly at random from a [training set](https://wiki.g15e.com/pages/Training%20data.txt).[^1]

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#SGD
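
The definition above can be sketched in a few lines of Python: each update uses a single example drawn uniformly at random from the training set (batch size one). The data, model, and learning rate here are illustrative assumptions, not part of the source.

```python
import random

random.seed(0)

# Toy training set for a 1-D linear model y ≈ w * x (illustrative data)
examples = [(x, 2.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]

def grad(w, x, y):
    # Gradient of the squared loss (w*x - y)**2 with respect to w
    return 2.0 * (w * x - y) * x

w = 0.0
for _ in range(500):
    x, y = random.choice(examples)  # batch size one: a single uniform random example
    w -= 0.1 * grad(w, x, y)       # one SGD update step

print(round(w, 2))  # → 2.0
```

Because each step sees only one example, individual updates are noisy, but on average they follow the full-batch gradient and the estimate still converges.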