# Rectified linear unit

> An activation function with the following behavior:

An [activation function](https://wiki.g15e.com/pages/Activation%20function.txt) with the following behavior:

- If the input is negative or zero, then the output is 0.
- If the input is positive, then the output is equal to the input.

For example:

- If the input is -3, then the output is 0.
- If the input is +3, then the output is 3.0.

ReLU is a very popular activation function. Despite its simple behavior, ReLU still enables a [neural network](https://wiki.g15e.com/pages/Artificial%20neural%20network.txt) to learn [nonlinear relationships](https://wiki.g15e.com/pages/Nonlinear%20relationship.txt) between [features](https://wiki.g15e.com/pages/Feature%20(machine%20learning.txt)) and the [label](https://wiki.g15e.com/pages/Label%20(machine%20learning.txt)).
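
A minimal sketch of this behavior in Python (the function name `relu` is illustrative; this page itself defines no code):

```python
def relu(x: float) -> float:
    # Negative or zero input -> 0; positive input -> the input unchanged.
    return max(0.0, x)

print(relu(-3.0))  # 0.0
print(relu(3.0))   # 3.0
```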