Leaky ReLU definition:

The Leaky ReLU activation function is defined as:
f(x) = max(0.01x, x)
The important thing to note is that when x is below 0 the derivative is 0.01 rather than 0, so the neuron still receives a small gradient. This lets a neuron that standard ReLU would leave "dead" (zero gradient for all negative inputs) keep updating and potentially reactivate, improving overall learning performance.
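As a minimal sketch of the definition above, here is a NumPy implementation of Leaky ReLU and its derivative; the function names and the `alpha` parameter (the 0.01 slope) are illustrative choices, not from the original text:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # f(x) = max(alpha * x, x): identity for x >= 0, small slope alpha for x < 0
    return np.where(x >= 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # Derivative: 1 for x >= 0 and alpha (not 0) for x < 0,
    # so negative inputs still propagate a small gradient.
    return np.where(x >= 0, 1.0, alpha)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))
print(leaky_relu_grad(x))
```

Contrast this with standard ReLU, where the gradient for x < 0 is exactly 0 and a neuron stuck in the negative region stops learning entirely.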