# Clipping (machine learning)

A technique for handling [outliers](https://wiki.g15e.com/pages/Outliers.txt) by doing either or both of the following:[^1]

- Reducing [feature](https://wiki.g15e.com/pages/Feature%20(machine%20learning).txt) values that are greater than a maximum threshold down to that maximum threshold.
- Increasing feature values that are less than a minimum threshold up to that minimum threshold.

For example, suppose that fewer than 0.5% of values for a particular feature fall outside the range 40–60. In this case, you could do the following:[^1]

- Clip all values over 60 (the maximum threshold) to be exactly 60.
- Clip all values under 40 (the minimum threshold) to be exactly 40.

Outliers can damage models, sometimes causing [weights](https://wiki.g15e.com/pages/Weight%20(machine%20learning).txt) to overflow during training. Some outliers can also dramatically spoil metrics like [accuracy](https://wiki.g15e.com/pages/Accuracy%20(machine%20learning).txt). Clipping is a common technique to limit the damage.[^1]

[Gradient clipping](https://wiki.g15e.com/pages/Gradient%20clipping.txt) forces [gradient](https://wiki.g15e.com/pages/Gradient%20(machine%20learning).txt) values within a designated range during [training](https://wiki.g15e.com/pages/Training%20(machine%20learning).txt).[^1]

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#clipping
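The 40–60 example above can be sketched with NumPy's `np.clip`; the feature values here are made up for illustration:

```python
import numpy as np

# Hypothetical feature values; most fall in 40-60, with a few outliers.
values = np.array([12.0, 41.5, 47.0, 52.3, 58.9, 60.0, 95.0])

# Clip to the thresholds from the example: values under 40 become 40,
# values over 60 become 60, everything in between is unchanged.
clipped = np.clip(values, 40.0, 60.0)

print(clipped)  # [40.  41.5 47.  52.3 58.9 60.  60. ]
```

The same one-liner works elementwise on whole feature columns, which is why clipping is often applied as a preprocessing step before training.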