Rectified linear unit (ReLU)
An activation function with the following behavior:
- If the input is negative or zero, then the output is 0.
- If the input is positive, then the output is equal to the input.
For example:
- If the input is -3, then the output is 0.0.
- If the input is +3, then the output is 3.0.
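A minimal sketch of this behavior in Python (NumPy is used for convenience; the function name `relu` is illustrative):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: 0 for negative or zero inputs, the input itself otherwise."""
    return np.maximum(0, x)

print(relu(-3.0))  # 0.0
print(relu(3.0))   # 3.0
```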
ReLU is a very popular activation function. Despite its simple behavior, ReLU still enables a neural network to learn nonlinear relationships between features and the label.