
The sigmoid function
The sigmoid function is smooth and continuously differentiable, and it produces a nonlinear output. It is defined as follows:

sigmoid(x) = 1 / (1 + e^(-x))

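To make this concrete, here is a minimal sketch of the function in Python (assuming NumPy is available; the name sigmoid is our own choice for illustration):

    import numpy as np

    def sigmoid(x):
        # Squash any real-valued input into the (0, 1) range
        return 1.0 / (1.0 + np.exp(-x))

    print(sigmoid(0.0))  # 0.5, the midpoint of the curve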
Please look at the following graph of the sigmoid function. The function ranges from 0 to 1. Observing the curve of the function, we see that the gradient is very high when x is between -3 and 3, but becomes flat beyond that range. Thus, we can say that small changes in x within this region bring large changes in the value of the sigmoid function; in effect, the function pushes its output values towards the extremes.
This is why it is used in classification problems:

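To see this pushing towards the extremes numerically, we can evaluate the sigmoid at a few points (a small sketch, again assuming NumPy):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # A few units away from the origin, the output is already close to 0 or 1,
    # which is why it is convenient to read as a class probability
    for x in (-5.0, -1.0, 0.0, 1.0, 5.0):
        print(x, sigmoid(x))
    # sigmoid(-5) ~ 0.0067 and sigmoid(5) ~ 0.9933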
Looking at the gradient of the sigmoid function in the following graph, we observe a smooth curve that depends on x. Since the gradient curve is continuous, it is easy to backpropagate the error and update the parameters, that is, the weights w and the biases b:

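The gradient also has a convenient closed form, sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)), which makes it cheap to compute during backpropagation. Here is a minimal sketch (assuming NumPy; sigmoid_grad is our own name):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_grad(x):
        # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
        s = sigmoid(x)
        return s * (1.0 - s)

    print(sigmoid_grad(0.0))  # 0.25, the largest the gradient ever gets
    print(sigmoid_grad(5.0))  # ~0.0066, nearly flat, as described below

Note that this gradient never exceeds 0.25, so multiplying it repeatedly across many layers shrinks the error signal, which foreshadows the vanishing gradient problem discussed at the end of this section.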
Sigmoids are widely used, but their disadvantage is that the function goes flat beyond +3 and -3. Thus, whenever the inputs fall in that region, the gradients tend to approach zero and the learning of our neural network comes to a halt.
Since the sigmoid function outputs values from 0 to 1, that is, all positive, it is not symmetric around the origin, and all output signals have the same sign. To tackle this, the sigmoid function has been scaled to the tanh function, which we will study next. Moreover, since the gradient results in a very small value, the sigmoid is susceptible to the vanishing gradient problem (which we will discuss later in this chapter).
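As a side note on that scaling, tanh can be written as a shifted and stretched sigmoid, tanh(x) = 2 * sigmoid(2x) - 1, which centres the output range at the origin (-1 to 1). A small sketch verifying this identity (assuming NumPy):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # tanh is a rescaled sigmoid: tanh(x) = 2 * sigmoid(2x) - 1,
    # so its output is symmetric around the origin (range -1 to 1)
    x = np.linspace(-3.0, 3.0, 7)
    print(np.allclose(2.0 * sigmoid(2.0 * x) - 1.0, np.tanh(x)))  # True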