# Exploding gradient problem

> The tendency for gradients in DNNs (especially RNNs) to become surprisingly steep (high). Steep gradients often cause very large updates to the weights of each neuron in a DNN.[^1]

The tendency for [gradients](https://wiki.g15e.com/pages/Gradient%20(machine%20learning).txt) in [DNNs](https://wiki.g15e.com/pages/Deep%20neural%20network.txt) (especially [RNNs](https://wiki.g15e.com/pages/Recurrent%20neural%20network.txt)) to become surprisingly steep (high). Steep gradients often cause very large updates to the [weights](https://wiki.g15e.com/pages/Weight%20(machine%20learning).txt) of each [neuron](https://wiki.g15e.com/pages/Artificial%20neuron.txt) in a DNN.[^1]

Models suffering from the exploding gradient problem become difficult or impossible to [train](https://wiki.g15e.com/pages/Training%20(machine%20learning).txt). [Gradient clipping](https://wiki.g15e.com/pages/Gradient%20clipping.txt) can mitigate this problem; a minimal sketch follows at the end of this page.

## See also

- [Vanishing gradient problem](https://wiki.g15e.com/pages/Vanishing%20gradient%20problem.txt)

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#exploding-gradient-problem
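
To make the mitigation concrete, here is a minimal sketch of gradient clipping during one training step in PyTorch. The toy RNN, the random data, and the `max_norm=1.0` threshold are illustrative assumptions, not part of the glossary definition; `torch.nn.utils.clip_grad_norm_` is PyTorch's standard norm-clipping utility.

```python
import torch
import torch.nn as nn

# Toy setup (illustrative assumptions): a small RNN trained on random data.
model = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(4, 32, 8)        # 4 sequences, 32 time steps, 8 features
target = torch.randn(4, 32, 16)  # matches the RNN's hidden_size output

output, _ = model(x)             # output shape: (4, 32, 16)
loss = nn.functional.mse_loss(output, target)

optimizer.zero_grad()
loss.backward()

# Gradient clipping: rescale all gradients so their global L2 norm is at
# most max_norm, bounding the size of the weight update even when
# backpropagation through many time steps makes the gradients explode.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

optimizer.step()
```

Clipping by norm, as above, preserves the gradient's direction while shrinking its magnitude; an alternative is clipping each gradient component to a fixed range with `torch.nn.utils.clip_grad_value_`.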