Epoch (machine learning)

A full training pass over the entire training set such that each example has been processed once. An epoch represents N / batch size training iterations, where N is the total number of examples. [1]

For instance, suppose the following:

  • The dataset consists of 1,000 examples.
  • The batch size is 50 examples.

Therefore, a single epoch requires 20 iterations (1,000 / 50 = 20), as in the sketch below.
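
The same arithmetic can be expressed in a few lines of Python. This is a minimal sketch of the calculation; the variable names and the use of math.ceil (to cover datasets whose size is not an exact multiple of the batch size) are illustrative assumptions, not part of the glossary entry.

```python
import math

# Values from the example above (illustrative).
num_examples = 1_000
batch_size = 50

# Iterations (one weight update per batch) needed to process every example once.
# math.ceil covers the case where num_examples is not divisible by batch_size.
iterations_per_epoch = math.ceil(num_examples / batch_size)
print(iterations_per_epoch)  # 20
```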

Number of epochs

The number of epochs is a hyperparameter you set before the model begins training. In many cases, you’ll need to experiment with how many epochs it takes for the model to converge. In general, training for more epochs produces a better model, but also takes more time to train. [2]
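
To make the role of this hyperparameter concrete, here is a minimal sketch of a mini-batch training loop in which the number of epochs is fixed before training starts. The toy linear-regression data, the NumPy gradient-descent update, and all parameter values are illustrative assumptions, not something prescribed by the glossary.

```python
import numpy as np

# Hypothetical toy problem: recover y = 3x + 2 with mini-batch gradient descent.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=1_000)
y = 3.0 * x + 2.0 + rng.normal(scale=0.1, size=x.size)

batch_size = 50
num_epochs = 30          # hyperparameter chosen before training starts
learning_rate = 0.1
w, b = 0.0, 0.0

for epoch in range(num_epochs):
    order = rng.permutation(x.size)             # reshuffle examples each epoch
    for start in range(0, x.size, batch_size):  # 1,000 / 50 = 20 iterations per epoch
        idx = order[start:start + batch_size]
        error = (w * x[idx] + b) - y[idx]
        # Gradient of mean squared error with respect to w and b.
        w -= learning_rate * 2.0 * np.mean(error * x[idx])
        b -= learning_rate * 2.0 * np.mean(error)

print(f"after {num_epochs} epochs: w ≈ {w:.2f}, b ≈ {b:.2f}")
```

Doubling num_epochs here doubles the number of weight updates, and therefore the training time, which is the trade-off described above.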

Footnotes

  1. https://developers.google.com/machine-learning/glossary#epoch

  2. https://developers.google.com/machine-learning/crash-course/linear-regression/hyperparameters
