The Parametric ReLU (PReLU) activation function, also called Parameterized ReLU, is a variant of ReLU. It is similar to Leaky ReLU, with a slight change in how negative input values are handled.
Mathematically, PReLU is defined as

$$\mathrm{PReLU}(x) = \begin{cases} x, & x \ge 0 \\ a \cdot x, & x < 0 \end{cases}$$

where:

x: an input data point
a: a learnable parameter controlling the negative slope

If $a = 0$, PReLU becomes ReLU; if $a$ is fixed to a small constant, it reduces to Leaky ReLU. While the positive part is linear, the slope of the negative part adaptively learns during the training phase.
```python
# Parametric ReLU
def param_relu(x, a=0.1):
    result = []
    for i in x:
        if i < 0:
            i = a * i  # scale negative inputs by the slope a
        result.append(i)
    return result

# a is fixed here for plotting; in a network it would be learned
y = param_relu(x, a=0.1)
plot_graph(x, y, 'Parametric ReLU')
```
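The snippet assumes that `x` and a `plot_graph` helper were defined earlier; for a self-contained run, something like the following sketch works (the input range and plotting style are assumptions, not fixed by the function itself):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical stand-ins for the inputs and plotting helper used above
x = np.linspace(-10, 10, 100)

def plot_graph(x, y, title):
    plt.plot(x, y)
    plt.title(title)
    plt.xlabel('x')
    plt.ylabel('f(x)')
    plt.grid(True)
    plt.show()
```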
Though Parametric ReLU is typically treated as a drop-in alternative to ReLU, it was originally proposed to improve performance on large-scale image recognition tasks.
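As a rough illustration of that use case, the sketch below drops PReLU into a toy convolutional classifier for 32x32 RGB images; the architecture, layer sizes, and the use of PyTorch's `nn.PReLU` are illustrative assumptions, not part of the original example:

```python
import torch
import torch.nn as nn

# Toy CNN for 32x32 RGB inputs (e.g., CIFAR-10-sized images)
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.PReLU(),              # learnable negative slope instead of ReLU's hard zero
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 16 * 16, 10),
)

out = model(torch.randn(1, 3, 32, 32))
print(out.shape)  # torch.Size([1, 10])
```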
The function introduces a learnable parameter that controls the negative slope, while retaining ReLU's unboundedness on the positive side. However, because the learnable slope replaces ReLU's hard lower bound at zero, the negative side no longer saturates, so PReLU gives up the benefits associated with ReLU's one-sided saturation.
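To make the learnable slope concrete, here is a minimal sketch that recovers a target negative slope by gradient descent on a squared-error loss; the synthetic data, learning rate, and iteration count are illustrative assumptions:

```python
import numpy as np

# Synthetic task: recover a "true" negative slope of 0.25
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, size=1000)
y_true = np.where(x > 0, x, 0.25 * x)

a = 0.01    # initial slope
lr = 0.01   # learning rate
for _ in range(200):
    y_pred = np.where(x > 0, x, a * x)
    # dL/da for squared error: the gradient flows only through negative inputs
    grad = np.mean(2 * (y_pred - y_true) * np.where(x < 0, x, 0.0))
    a -= lr * grad

print(a)  # converges towards 0.25
```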