# Entropy (machine learning)

> In information theory, a description of how unpredictable a probability distribution is. Alternatively, entropy is also defined as how much information each example contains. A distribution has the highest possible entropy when all values of a random variable are equally likely.[^1]

In information theory, a description of how unpredictable a [probability distribution](https://wiki.g15e.com/pages/Probability%20distribution.txt) is. Alternatively, entropy is also defined as how much information each [example](https://wiki.g15e.com/pages/Example%20(machine%20learning).txt) contains. A distribution has the highest possible entropy when all values of a random variable are equally likely.[^1]

TODO: Organize this entry further.

## Footnotes

[^1]: https://developers.google.com/machine-learning/glossary#entropy
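
The definition above is purely verbal. As a rough illustration (not part of the cited glossary), the sketch below computes the Shannon entropy H(p) = -Σ pᵢ log₂ pᵢ of a discrete distribution; the `entropy` helper is a hypothetical name chosen here for the example. It shows that a uniform distribution (most unpredictable) has the highest entropy, while a certain outcome has zero entropy.

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy H(p) = -sum(p_i * log(p_i)) of a discrete distribution.

    `probs` is assumed to be non-negative values summing to 1; outcomes with
    zero probability contribute nothing to the sum.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin (uniform distribution) is maximally unpredictable: 1 bit per example.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each example carries less information.
print(entropy([0.9, 0.1]))   # ~0.469
# A certain outcome carries no information at all.
print(entropy([1.0, 0.0]))   # 0.0
```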