Softsign Activation Function

Softsign is another activation function used in neural networks. It is defined as:

Softsign(x) = x / (1 + |x|)

where:

x: an input data point

|x|: the absolute value of x

It is similar to the Tanh function, but Tanh approaches its asymptotes exponentially fast, while Softsign approaches them only polynomially (i.e., more gradually).


Range: (-1, 1)

import numpy as np

# Softsign: x / (1 + |x|)
def softsign(x):
    return x / (np.abs(x) + 1)

x = np.linspace(-10, 10, 100)
y = softsign(x)
plot_graph(x, y, 'Softsign')  # plotting helper assumed defined earlier in the article
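The difference in how fast each function reaches its asymptote can be checked numerically. This is a small sketch (not from the original article) comparing the gap to the upper asymptote, 1 − f(x), for both functions:

```python
import numpy as np

# How far each function still is from its asymptote of 1.
for x in [2.0, 5.0, 10.0]:
    tanh_gap = 1 - np.tanh(x)       # shrinks exponentially, like 2*exp(-2x)
    softsign_gap = 1 - x / (1 + x)  # shrinks polynomially, like 1/x
    print(x, tanh_gap, softsign_gap)
```

At x = 10 the Tanh gap is already around 4e-9, while the Softsign gap is still 1/11 ≈ 0.09, illustrating the exponential-vs-polynomial behavior described above.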

Use cases:

Used in regression problems and in text-to-speech applications.


Considered an alternative to the Tanh function since it saturates more gently: for large positive or negative inputs, it flattens out more slowly than Tanh does.

Since it is zero-centered, gradient updates are better balanced and the network learns more effectively.


Still prone to gradient saturation: the gradient shrinks toward zero for inputs of large magnitude, which slows learning. (Note that the gradient is largest at x = 0, so the function does not produce dead neurons there.)


