Leaky ReLU Activation Function

Leaky ReLU is an activation function that modifies ReLU by adding a small slope for negative inputs, which helps keep neurons alive.

f(x) = x        for x ≥ 0
f(x) = 0.01x    for x < 0

x: an input data point

Although Leaky ReLU does not bring a dramatic improvement on its own, it is preferable to ReLU because it addresses the problem of dead neurons.

Range: (-∞, +∞)

# Leaky ReLU
def leaky_relu(x):
    result = []
    for i in x:
        # Scale negative inputs by a small slope (0.01) instead of zeroing them out
        if i < 0:
            i = 0.01 * i
        result.append(i)
    return result

# x (the input values) and plot_graph are assumed to be defined earlier in the article
y = leaky_relu(x)
plot_graph(x, y, 'Leaky ReLU')
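
For reference, the same function can be written in vectorised form with NumPy. This is a minimal sketch, not part of the original snippet; the helper name leaky_relu_np, the np.where formulation, and the sample inputs are illustrative choices.

import numpy as np

# Vectorised Leaky ReLU: keep non-negative values, scale negatives by 0.01
def leaky_relu_np(x, slope=0.01):
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, x, slope * x)

print(leaky_relu_np([-2.0, -0.5, 0.0, 1.0, 3.0]))
# -> [-0.02  -0.005  0.     1.     3.   ]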

Use cases:

Used in feedforward neural networks; applications of Leaky ReLU can also be found in automatic speech recognition systems.
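
As an illustration of the feedforward use case, the sketch below builds a tiny network with Leaky ReLU between linear layers. It assumes PyTorch; the layer sizes, batch shape, and the 0.01 slope are arbitrary choices, not taken from the original text.

import torch
import torch.nn as nn

# A small feedforward network that uses Leaky ReLU between its linear layers
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.LeakyReLU(negative_slope=0.01),
    nn.Linear(32, 1),
)

x_batch = torch.randn(4, 16)   # dummy batch: 4 samples, 16 features each
print(model(x_batch).shape)    # torch.Size([4, 1])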

 

Pros:

Addresses the problem of dying neurons by introducing a small negative slope, while retaining the boundedness and unboundedness properties of ReLU.
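
The effect on dying neurons is easiest to see in the gradients: for negative inputs ReLU's gradient is exactly zero, so the unit stops updating, whereas Leaky ReLU still passes a small gradient through. The plain-Python sketch below assumes the conventional 0.01 slope.

# Gradient of ReLU vs. Leaky ReLU for a negative input
def relu_grad(x):
    return 1.0 if x > 0 else 0.0          # zero gradient: the neuron can "die"

def leaky_relu_grad(x, slope=0.01):
    return 1.0 if x > 0 else slope        # small non-zero gradient keeps it learning

print(relu_grad(-2.0))        # 0.0
print(leaky_relu_grad(-2.0))  # 0.01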

 

Cons:

No significant improvement in results, apart from changes in sparsity and dispersion.

 

Although the outputs are closer to a zero-centered distribution, the one-sided saturation that helps ReLU converge is not attained effectively.
