ReLU definition:
The ReLU is an activation function defined as follows:
f(x) = max(0, x)
which simply returns its argument x whenever it is greater than zero, and 0 otherwise.
Also, one of the major advantages of the ReLU activation function is that it allows the error signal to be back-propagated while still introducing a non-linearity,
which enables the unit to behave analogously to a biological neuron and therefore model complicated non-linear functions.
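As a minimal sketch of the definition above (using NumPy, an assumption; the function names are illustrative), the ReLU and the derivative used during back-propagation can be written as:

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x), applied element-wise
    return np.maximum(0, x)

def relu_grad(x):
    # Gradient passed backward: 1 where x > 0, else 0
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # negative inputs are clipped to 0
print(relu_grad(x))  # error signal flows only where x > 0
```

The non-zero gradient for all positive inputs is what lets the error signal propagate backward without vanishing, while the zero region supplies the non-linearity.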