# Optimizer (machine learning)

> A specific implementation of the gradient descent algorithm.

A specific implementation of the [gradient descent](https://wiki.g15e.com/pages/Gradient%20descent.txt) algorithm. Popular optimizers include:[^1]

- [AdaGrad](https://wiki.g15e.com/pages/AdaGrad.txt), which stands for ADAptive GRADient descent.
- [Adam](https://wiki.g15e.com/pages/Adam.txt), which stands for ADAptive with Momentum.

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#optimizer
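
The two optimizers named above can be sketched as single update steps. This is a minimal illustration, not a reference implementation; the hyperparameter values (learning rates, `beta1`, `beta2`, `eps`) are common defaults assumed here, not taken from the source.

```python
def adagrad_step(w, grad, accum, lr=0.1, eps=1e-8):
    """One AdaGrad update: scale the step by accumulated squared gradients,
    so frequently-updated parameters get smaller effective learning rates."""
    accum = accum + grad * grad
    w = w - lr * grad / (accum ** 0.5 + eps)
    return w, accum

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum (first moment m) plus adaptive scaling
    (second moment v), with bias correction for early steps t = 1, 2, ..."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (v_hat ** 0.5 + eps)
    return w, m, v

# Toy use: minimize f(w) = w**2, whose gradient is 2w, with each optimizer.
w_a, accum = 5.0, 0.0
w_m, m, v = 5.0, 0.0, 0.0
for t in range(1, 201):
    w_a, accum = adagrad_step(w_a, 2 * w_a, accum)
    w_m, m, v = adam_step(w_m, 2 * w_m, m, v, t)
# Both w_a and w_m have moved from 5.0 toward the minimum at 0.
```

Both follow the same gradient descent skeleton `w = w - lr * grad`; they differ only in how the per-parameter step size is adapted over time.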