Exploding gradient problem

The tendency for gradients in DNNs (especially RNNs) to become surprisingly steep (high). Steep gradients often cause very large updates to the weights of each neuron in a DNN.[1]

Models suffering from the exploding gradient problem become difficult or impossible to train. Gradient clipping can mitigate this problem.[1]
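
As an illustration, here is a minimal gradient-clipping sketch in PyTorch; the RNN, shapes, and hyperparameters are placeholders, not from the source:

```python
import torch
import torch.nn as nn

# Toy RNN and synthetic data (illustrative placeholders).
model = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
inputs = torch.randn(4, 32, 8)    # (batch, seq_len, features)
targets = torch.randn(4, 32, 16)  # (batch, seq_len, hidden_size)

outputs, _ = model(inputs)
loss = nn.functional.mse_loss(outputs, targets)

optimizer.zero_grad()
loss.backward()

# Rescale gradients so their combined norm is at most 1.0,
# preventing one exploded gradient from causing a huge weight update.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

optimizer.step()
```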

Footnotes

  1. developers.google.com/machine-learning/glossary#exploding-gradient-problem