Optimizer (machine learning)
- 2024-10-21
- Alias: Optimizer
A specific implementation of the gradient descent algorithm. Popular optimizers include:
- AdaGrad, which stands for ADAptive GRADient descent.
- Adam, which stands for ADAptive Moment estimation.
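As a sketch of what "a specific implementation of gradient descent" means in practice, here is a minimal Adam update step in plain Python. The hyperparameter defaults follow the Adam paper; the quadratic objective being minimized is an illustrative assumption, not part of the definition.

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter theta."""
    # Exponential moving averages of the gradient (first moment)
    # and the squared gradient (second moment).
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    # Bias correction compensates for the zero initialization of m and v.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Parameter update: step size adapts per-parameter via v_hat.
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(theta) = (theta - 3)^2, whose gradient is 2 * (theta - 3).
theta, m, v = 0.0, 0.0, 0.0
for t in range(1, 501):
    grad = 2.0 * (theta - 3.0)
    theta, m, v = adam_step(theta, grad, m, v, t)
print(theta)  # converges toward the minimum at theta = 3
```

Plain gradient descent would use `theta - lr * grad` directly; Adam differs only in how it rescales that step using the two moment estimates.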