# Dropout regularization

A form of [regularization](https://wiki.g15e.com/pages/Regularization%20(machine%20learning).txt) useful in training [neural networks](https://wiki.g15e.com/pages/Artificial%20neural%20network.txt). Dropout regularization removes a random selection of a fixed number of the units in a network layer for a single gradient step. The more units dropped out, the stronger the regularization. This is analogous to [training](https://wiki.g15e.com/pages/Training%20(machine%20learning).txt) the network to emulate an exponentially large ensemble of smaller networks.[^1] A minimal sketch of this mechanism follows the footnotes below.

For full details, see Dropout: [A simple way to prevent neural networks from overfitting](https://wiki.g15e.com/pages/A%20simple%20way%20to%20prevent%20neural%20networks%20from%20overfitting.txt).

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#dropout_regularization
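
## Example

As an illustration, here is a minimal NumPy sketch of the common "inverted dropout" formulation, in which each unit is dropped independently with probability `rate` and the survivors are rescaled so the expected activation is unchanged. The `dropout` helper and its parameters are illustrative assumptions, not an API from the cited glossary or paper.

```python
import numpy as np

def dropout(activations, rate, training=True, rng=None):
    """Apply inverted dropout to a layer's activations (illustrative sketch).

    During training, each unit is zeroed independently with probability
    `rate`, and surviving units are scaled by 1 / (1 - rate) so the
    expected activation stays the same. At inference time, activations
    pass through unchanged.
    """
    if not training or rate == 0.0:
        return activations
    rng = rng or np.random.default_rng()
    keep_prob = 1.0 - rate
    # Bernoulli mask: 1 keeps a unit, 0 drops it for this gradient step.
    mask = rng.binomial(1, keep_prob, size=activations.shape)
    return activations * mask / keep_prob

# A fresh mask is drawn at every gradient step, so each step effectively
# trains a different "thinned" sub-network drawn from the full network.
hidden = np.random.default_rng(0).standard_normal((4, 8))
print(dropout(hidden, rate=0.5))
```

Drawing a new random mask for every gradient step is what makes training behave like averaging over an exponentially large ensemble of thinned sub-networks.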