# Learning rate

A floating-point number that tells the [gradient descent](https://wiki.g15e.com/pages/Gradient%20descent.txt) algorithm how strongly to adjust [weights](https://wiki.g15e.com/pages/Weight%20(machine%20learning.txt)) and [biases](https://wiki.g15e.com/pages/Bias%20(machine%20learning.txt)) on each [iteration](https://wiki.g15e.com/pages/Iteration%20(machine%20learning.txt)). For example, a learning rate of 0.3 would adjust weights and biases three times more strongly than a learning rate of 0.1.

Learning rate is a key [hyperparameter](https://wiki.g15e.com/pages/Hyperparameter.txt). If you set the learning rate too low, training will take too long. If you set it too high, gradient descent often has trouble reaching [convergence](https://wiki.g15e.com/pages/Convergence%20(machine%20learning.txt)).[^1]

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#learning-rate
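The effect of the learning rate can be seen in a minimal sketch of gradient descent on a single weight. The function, starting point, and step counts below are illustrative assumptions, not part of any particular library: minimizing f(w) = (w - 3)^2, a small learning rate converges slowly, a moderate one converges quickly, and a rate that is too high overshoots and diverges.

```python
def gradient_descent(grad, w, learning_rate, steps):
    """Repeatedly step against the gradient, scaled by the learning rate."""
    for _ in range(steps):
        w -= learning_rate * grad(w)
    return w

# Illustrative objective: f(w) = (w - 3)**2, with gradient 2 * (w - 3).
grad = lambda w: 2 * (w - 3)

w_slow = gradient_descent(grad, w=0.0, learning_rate=0.1, steps=100)   # converges to ~3
w_fast = gradient_descent(grad, w=0.0, learning_rate=0.3, steps=100)   # converges faster
w_high = gradient_descent(grad, w=0.0, learning_rate=1.1, steps=100)   # overshoots and diverges
```

Here each update multiplies the distance to the minimum by (1 − 2 × learning_rate), so rates below 1.0 shrink the error on every step while rates above 1.0 amplify it, which mirrors the convergence trouble described above.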