# Backpropagation

> The algorithm that implements [gradient descent](https://wiki.g15e.com/pages/Gradient%20descent.txt) in [neural networks](https://wiki.g15e.com/pages/Artificial%20neural%20network.txt).[^1]

Training a neural network involves many [iterations](https://wiki.g15e.com/pages/Iteration%20(machine%20learning).txt) of the following two-pass cycle:[^1]

1. During the **forward pass**, the system processes a [batch](https://wiki.g15e.com/pages/Batch%20(machine%20learning).txt) of [examples](https://wiki.g15e.com/pages/Example%20(machine%20learning).txt) to yield prediction(s). The system compares each prediction to each [label](https://wiki.g15e.com/pages/Label%20(machine%20learning).txt) value. The difference between the prediction and the label value is the [loss](https://wiki.g15e.com/pages/Loss%20(machine%20learning).txt) for that example. The system aggregates the losses for all the examples to compute the total loss for the current batch.
2. During the **backward pass** (backpropagation), the system reduces loss by adjusting the weights of all the [neurons](https://wiki.g15e.com/pages/Artificial%20neuron.txt) in all the [hidden layer(s)](https://wiki.g15e.com/pages/Hidden%20layer.txt).

Neural networks often contain many neurons across many hidden layers. Each of those neurons contributes to the overall loss in different ways. Backpropagation determines whether to increase or decrease the weights applied to particular neurons.[^1]

The [learning rate](https://wiki.g15e.com/pages/Learning%20rate.txt) is a multiplier that controls the degree to which each backward pass increases or decreases each weight. A large learning rate will increase or decrease each weight more than a small learning rate.[^1]

In calculus terms, backpropagation implements the chain rule.
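The two-pass cycle above can be sketched for a single linear neuron. This is a minimal illustration, assuming a squared-error loss; the data, weight, and variable names are hypothetical, not from the source:

```python
xs = [1.0, 2.0, 3.0]   # batch of examples
ys = [2.0, 4.0, 6.0]   # labels (true relationship: y = 2x)
w = 0.0                # the single weight to learn
lr = 0.1               # learning rate

for step in range(50):
    # Forward pass: predictions, then the aggregated (mean squared) loss for the batch
    preds = [w * x for x in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)

    # Backward pass: partial derivative of the loss with respect to w
    dw = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)

    # Update: the learning rate scales how much each backward pass changes w
    w -= lr * dw

print(round(w, 2))  # → 2.0
```

Doubling `lr` here makes each update step larger, which is exactly the multiplier effect the learning rate has on every weight in a real network.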
That is, backpropagation calculates the [partial derivative](https://wiki.g15e.com/pages/Partial%20derivative.txt) of the error with respect to each parameter.[^1]

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#backpropagation