# Regularization rate

A number that specifies the relative importance of [regularization](https://wiki.g15e.com/pages/Regularization%20(machine%20learning.txt)) during [training](https://wiki.g15e.com/pages/Training%20(machine%20learning.txt)). Raising the regularization rate reduces [overfitting](https://wiki.g15e.com/pages/Overfitting.txt) but may reduce the [model](https://wiki.g15e.com/pages/Model%20(machine%20learning.txt))'s predictive power. Conversely, reducing or omitting the regularization rate increases overfitting.[^1]

The regularization rate is usually represented as $\lambda$. The following simplified loss equation shows lambda's influence:[^1]

$$
\text{minimize}(\text{loss function} + \lambda(\text{regularization}))
$$

where *regularization* is any regularization mechanism, including:

- [L1 regularization](https://wiki.g15e.com/pages/L1%20regularization.txt)
- [L2 regularization](https://wiki.g15e.com/pages/L2%20regularization.txt)

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#regularization_rate
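
To make the equation above concrete, here is a minimal sketch (not from the glossary; plain NumPy with hypothetical names such as `regularized_loss` and `lam`) of a mean-squared-error loss plus an L2 penalty, where `lam` plays the role of $\lambda$:

```python
import numpy as np

def regularized_loss(weights, X, y, lam):
    """Mean squared error plus an L2 penalty, weighted by the regularization rate."""
    predictions = X @ weights
    data_loss = np.mean((predictions - y) ** 2)  # the "loss function" term
    l2_penalty = np.sum(weights ** 2)            # the "regularization" term (L2)
    return data_loss + lam * l2_penalty          # lambda balances the two terms

# Toy data (illustrative only): for nonzero weights, a larger lam increases the
# penalized loss, so during training it pushes the weights toward zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.1, size=100)
w = rng.normal(size=5)

print(regularized_loss(w, X, y, lam=0.0))  # unregularized loss
print(regularized_loss(w, X, y, lam=1.0))  # same loss with the L2 penalty added
```

Setting `lam` to zero recovers the unregularized loss (maximum risk of overfitting), while increasing it trades training fit for simpler weights.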