Exploding gradient problem
The tendency for gradients in DNNs (especially RNNs) to become surprisingly steep (high). Steep gradients often cause very large updates to the weights of each neuron in a DNN.
Models suffering from the exploding gradient problem become difficult or impossible to train. Gradient clipping can mitigate this problem.
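A minimal sketch of gradient clipping, assuming PyTorch; the toy RNN, shapes, and hyperparameters are illustrative choices, not from the source:

```python
import torch
import torch.nn as nn

# Toy recurrent model and optimizer (illustrative values).
model = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(4, 10, 8)        # (batch, sequence length, features)
target = torch.randn(4, 10, 16)  # matches the RNN's output shape

output, _ = model(x)
loss = loss_fn(output, target)

optimizer.zero_grad()
loss.backward()

# Gradient clipping: rescale all gradients so their global norm is at most
# max_norm, preventing a steep gradient from producing a huge weight update.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

optimizer.step()
```

Clipping by global norm (as above) preserves the direction of the update while bounding its size; clipping each gradient value individually is another common variant.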