Sigmoid activation units
The output of the sigmoid activation unit, y, as a function of its total input, x, is expressed as follows:
$$y = \frac{1}{1 + e^{-x}}$$
Since the sigmoid activation unit's response is a nonlinear function of its input, as shown in the following graph, it is used to introduce nonlinearity into the neural network:
Figure 1.6: Sigmoid activation function
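The following is a minimal NumPy sketch of the sigmoid function shown in Figure 1.6; the function name and the sample inputs are illustrative, not taken from the book's code:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: squashes any real-valued input into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# A few sample total inputs to the unit
x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(sigmoid(x))  # outputs approach 0 for large negative x and 1 for large positive x
```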
Most complex processes in nature are nonlinear in their input-output relationships, and hence we need nonlinear activation functions to model them through neural networks. The output probability of a neural network for two-class classification is generally given by the output of a sigmoid unit, since it produces values between zero and one. This output probability can be represented as follows:
$$P(y = 1 \mid x) = \frac{1}{1 + e^{-x}}$$
Here, x represents the total input to the sigmoid unit in the output layer.
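To illustrate this, the following is a hedged sketch of how a sigmoid output unit can turn the total input of the final layer into a two-class probability; the weights, bias, feature values, and the 0.5 decision threshold are assumptions for the example, not values from the book:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical learned parameters of the output unit (illustrative values)
w = np.array([0.8, -0.4, 1.2])  # weights
b = -0.1                        # bias

features = np.array([0.5, 1.5, 0.3])    # one input example
total_input = np.dot(w, features) + b   # x: total input to the sigmoid unit

p_class_1 = sigmoid(total_input)        # P(y = 1 | x)
p_class_0 = 1.0 - p_class_1             # P(y = 0 | x)
predicted_class = int(p_class_1 >= 0.5)  # threshold at 0.5, a common convention
print(p_class_1, predicted_class)
```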