Stochastic gradient descent

  • 2024-10-21
  • Alias: SGD

A gradient descent algorithm in which the batch size is one. In other words, SGD trains on a single example chosen uniformly at random from a training set.¹
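The definition above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration on a toy 1-D linear regression (squared-error loss), not a reference implementation: each update uses exactly one example drawn uniformly at random, which is the batch-size-one case that distinguishes SGD from full-batch gradient descent.

```python
import random

def sgd(examples, grad, w, lr=0.05, steps=500):
    """Plain SGD: each step uses one example drawn uniformly at random."""
    for _ in range(steps):
        x, y = random.choice(examples)   # batch size = 1
        w = w - lr * grad(w, x, y)       # step against the gradient
    return w

# Toy data from y = 3x; loss = (w*x - y)^2, so d(loss)/dw = 2*x*(w*x - y)
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
grad = lambda w, x, y: 2 * x * (w * x - y)

w = sgd(data, grad, w=0.0)
# w converges toward 3.0
```

Because each step sees only one example, individual updates are noisy, but over many steps the estimate still converges toward the full-batch solution.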

Footnotes

  1. developers.google.com/machine-learning/glossary#SGD

2025 © ak