Parameterized ReLU Activation Function

Parameterized ReLU, or Parametric ReLU (PReLU), is a variant of the ReLU activation function. It is similar to Leaky ReLU, with a slight change in how negative input values are handled.

f(x) = x,      if x > 0
f(x) = a * x,  if x <= 0

  x: an input data point
  a: a learnable parameter that controls the negative slope

If a = 0, PReLU becomes ReLU. While the positive part is linear, the negative part of the function is learned adaptively during the training phase.

Range: (-∞, ∞)

# Parametric ReLU: pass positive inputs through unchanged,
# scale negative inputs by the slope a
def param_relu(x, a=0.1):
    result = []
    for i in x:
        if i < 0:
            i = a * i  # negative values are multiplied by the slope a
        result.append(i)
    return result

y = param_relu(x, a=0.1)
plot_graph(x, y, 'Parametric ReLU') 
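
The fixed slope of 0.1 above is only used to illustrate the shape of the function; in practice the slope is a trainable parameter updated by backpropagation along with the network weights. As a minimal sketch (assuming PyTorch is available), torch.nn.PReLU exposes the negative slope as a learnable weight:

import torch
import torch.nn as nn

# Sketch only: PReLU with a single learnable negative slope, initialised to 0.1
prelu = nn.PReLU(num_parameters=1, init=0.1)

x = torch.tensor([-2.0, -0.5, 0.0, 1.0, 3.0])
y = prelu(x)               # tensor([-0.2000, -0.0500, 0.0000, 1.0000, 3.0000])

# The slope is an ordinary parameter, so it receives a gradient during training
y.sum().backward()
print(prelu.weight)        # current value of the slope a
print(prelu.weight.grad)   # d(loss)/da = sum of the negative inputs (-2.5 here)

During normal training, an optimizer step would then update the slope together with the rest of the network's weights.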

Use cases:

Parametric ReLU is typically used as a drop-in alternative to ReLU, notably in large-scale image recognition tasks.

Pros:

This function introduces a learnable parameter a that controls the negative slope. It also makes use of ReLU's properties of unboundedness (on the positive side) and boundedness (on the negative side).

Cons:

Though the learnable parameter a on the negative side introduces variation, giving up ReLU's one-sided saturation does not necessarily lead to better results.
