Scaled ELU Activation Function

The Scaled Exponential Linear Unit (SELU) activation function is a modified version of the ELU.

 x: an input data point

SELU(x) = λx              for x > 0
SELU(x) = λα(exp(x) − 1)  for x ≤ 0

In general, λ ≈ 1.0507 and α ≈ 1.6733.



# Scaled Exponential Linear Unit
import numpy as np

# Scaled Exponential Linear Unit
def s_elu(x, a=1.6733, t=1.0507):
    result = []
    for i in x:
        if i < 0:
            # negative branch: scaled exponential
            result.append(t * (a * (np.exp(i) - 1)))
        else:
            # positive branch: scaled identity
            result.append(t * i)
    return result

x = np.linspace(-10, 10, 100)  # sample inputs for plotting
y = s_elu(x)
plot_graph(x, y, 'Scaled Exponential Linear Unit')


SELU is used in hidden layers and can serve as an alternative to ReLU.



SELU induces a self-normalizing property in neural networks: the neuron activations converge towards zero mean and unit variance.
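This self-normalizing behavior can be checked empirically. A minimal NumPy sketch (assuming LeCun-normal weight initialization, std = 1/sqrt(fan_in), which SELU's self-normalization relies on): activations passed through many SELU layers keep their mean near 0 and variance near 1.

```python
import numpy as np

def selu(x, alpha=1.6733, lam=1.0507):
    # Vectorized SELU: lam*x for x > 0, lam*alpha*(exp(x)-1) otherwise
    return np.where(x > 0, lam * x, lam * alpha * np.expm1(x))

rng = np.random.default_rng(0)
x = rng.standard_normal((2000, 512))  # inputs: zero mean, unit variance

# Push activations through 10 dense layers with LeCun-normal weights.
for _ in range(10):
    w = rng.standard_normal((512, 512)) / np.sqrt(512)
    x = selu(x @ w)

# Mean and variance remain close to 0 and 1 despite the depth.
print(x.mean(), x.var())
```

With ReLU and the same initialization, the variance would drift instead of stabilizing; this contrast is the motivation for SELU.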


SELU is not affected by the vanishing and exploding gradient problems.



SELU needs more computational power while training the network.


