 # Sigmoid Activation Function

The Sigmoid function, also known as the Logistic function or squashing function, is a nonlinear activation function mostly used in feedforward neural networks. Given a real-valued input `x`, it maps it into the range (0, 1).

```python
# Import the NumPy library
import numpy as np

# Sigmoid function: maps any real number into (0, 1)
def sigmoid(x):
    return 1 / (1 + np.exp(-x))
```
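As a quick check, the function can be applied to scalars or whole arrays, since `np.exp` broadcasts over NumPy arrays:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# The midpoint: sigmoid(0) is exactly 0.5
print(sigmoid(0))  # 0.5

# Large negative inputs approach 0, large positive inputs approach 1
print(sigmoid(np.array([-5.0, 0.0, 5.0])))
```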

Use cases:

- Binary classification problems, such as Logistic Regression tasks.
- In neural networks, the Sigmoid function is mainly used in shallow networks.
- Recurrent networks, probabilistic models, and some autoencoders also make use of the Sigmoid function.
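The binary-classification use case can be sketched as follows. The weights, bias, and input below are hypothetical placeholders, not learned values; the point is that the Sigmoid turns a raw linear score into a probability, which is then thresholded at 0.5:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Hypothetical weights and bias for a two-feature logistic regression model
w = np.array([0.8, -0.4])
b = 0.1

def predict_proba(features):
    # Linear score w.x + b squashed into (0, 1): interpreted as P(class = 1)
    return sigmoid(np.dot(w, features) + b)

x = np.array([1.5, 2.0])
p = predict_proba(x)
label = int(p >= 0.5)  # threshold the probability to get a class label
```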

Pros:

- For any real-valued input, the function returns an output in the range (0, 1).
- The function is smooth, so its gradient changes gradually rather than jumping, which helps optimization.

Cons:

When the input values are very high or very low, we can observe that the function tends to be linear. This linearity causes a zero gradient.
Suffers from vanishing gradient. As the depth of the network increases the gradients approaches to 0 during backpropagation.
Slow convergence and non-zero centered outputs cause the gradient updates in different directions.
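The saturation and vanishing-gradient points can be seen numerically. The derivative of the Sigmoid is σ'(x) = σ(x)(1 − σ(x)), which peaks at 0.25 at x = 0 and is nearly zero for large |x|; multiplying many such factors across layers drives the overall gradient toward zero:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: s * (1 - s)
    s = sigmoid(x)
    return s * (1 - s)

print(sigmoid_grad(0.0))   # 0.25 -- the maximum possible value
print(sigmoid_grad(10.0))  # ~4.5e-5 -- effectively zero (saturation)

# Backpropagating through many sigmoid layers multiplies these factors,
# so even the best case (0.25 per layer) shrinks geometrically with depth
depth = 10
print(sigmoid_grad(0.0) ** depth)  # 0.25**10, roughly 1e-6
```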

There are other variants of the Sigmoid function, namely:

1. Hard Sigmoid
2. Sigmoid-Weighted Linear Units(SiLU)
3. Derivative of Sigmoid-Weighted Linear Units(dSiLU)
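The three variants above can be sketched as follows. Note that the exact coefficients of Hard Sigmoid differ between libraries; the common `0.2x + 0.5` form is assumed here:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def hard_sigmoid(x):
    # Piecewise-linear approximation of the sigmoid;
    # the 0.2x + 0.5 variant is assumed (coefficients vary by library)
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

def silu(x):
    # Sigmoid-Weighted Linear Unit: x * sigmoid(x)
    return x * sigmoid(x)

def dsilu(x):
    # Derivative of SiLU: sigmoid(x) * (1 + x * (1 - sigmoid(x)))
    s = sigmoid(x)
    return s * (1 + x * (1 - s))
```

Hard Sigmoid trades exactness for cheaper computation, while SiLU keeps a small gradient for negative inputs instead of saturating to zero like the plain Sigmoid.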


## Identity Function

An Identity Function, or Linear Function, is an activation in which the output is the same as the input: f(x) = x.
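A minimal sketch of the identity activation, following the same style as the Sigmoid block above; its derivative is the constant 1, so it does not squash gradients:

```python
import numpy as np

# Identity (linear) activation: returns the input unchanged
def identity(x):
    return x

print(identity(3.0))                    # 3.0
print(identity(np.array([-1.0, 2.0])))  # [-1.  2.]
```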