What are Activation Functions?
Activation functions are applied in the third step of a neuron's computation: the neuron takes its inputs, computes a weighted sum of them, and then passes that sum through the activation function to produce its output.
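As a minimal NumPy sketch of those three steps (the bias term and the concrete weight and input values here are illustrative additions, not from a specific network):

```python
import numpy as np

def neuron_output(inputs, weights, bias, activation):
    """A single neuron: weighted sum of inputs, then the activation."""
    z = np.dot(inputs, weights) + bias  # steps 1-2: weight and sum the inputs
    return activation(z)                # step 3: apply the activation function

# Using a sigmoid activation (defined later in this post)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
print(neuron_output(np.array([0.5, -1.2]), np.array([0.8, 0.3]), 0.1, sigmoid))
```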
Types of Activation Functions
This post covers six commonly used activation functions:
Threshold function
The threshold function returns 0 if the input is less than 0, and 1 if the input is 0 or greater. It's a yes/no kind of function. To express it in a formula:

$$\phi(x) = \begin{cases} 1 & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases}$$
Its graph is a step: flat at 0 for all negative inputs, jumping straight up to 1 at zero.
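Here is one way to write it in NumPy:

```python
import numpy as np

def threshold(x):
    """Step function: 0 for negative inputs, 1 otherwise."""
    return np.where(x >= 0, 1.0, 0.0)

print(threshold(np.array([-2.0, 0.0, 3.5])))  # [0. 1. 1.]
```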
Sigmoid function
The sigmoid function squashes any input into the range (0, 1), which makes it useful in the final layer (output layer), especially when predicting probabilities. To express it in a formula:

$$\sigma(x) = \frac{1}{1 + e^{-x}}$$
Its graph is a smooth S-shaped curve that approaches 0 for large negative inputs, crosses 0.5 at zero, and approaches 1 for large positive inputs.
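The same function in NumPy:

```python
import numpy as np

def sigmoid(x):
    """Squashes any input into the open interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-4.0, 0.0, 4.0])))  # ~[0.018 0.5 0.982]
```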
Rectifier function (ReLU)
The rectifier function (ReLU) is one of the most popular activation functions for ANNs. It returns 0 if the input is less than 0, and returns the input itself if it is greater than 0. To express it in a formula:

$$\phi(x) = \max(0, x)$$
Its graph is flat at 0 for negative inputs and a straight line with slope 1 for positive inputs.
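In NumPy this is a one-liner:

```python
import numpy as np

def relu(x):
    """Rectifier: 0 for negative inputs, identity for positive ones."""
    return np.maximum(0.0, x)

print(relu(np.array([-3.0, 0.0, 2.5])))  # [0. 0. 2.5]
```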
Hyperbolic Tangent (tanh) function
It is similar in shape to the sigmoid function, but its output ranges from -1 to 1, so it goes below 0. To express it in a formula:

$$\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$
Its graph is the same S-shaped curve as the sigmoid, but centered at 0 and spanning -1 to 1.
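NumPy ships this one directly:

```python
import numpy as np

def tanh(x):
    """Sigmoid-shaped, but outputs range from -1 to 1."""
    return np.tanh(x)

print(tanh(np.array([-2.0, 0.0, 2.0])))  # ~[-0.964 0. 0.964]
```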
Mish Activation
Mish is unbounded above, so its graph tends toward positive infinity and it avoids the saturation caused by capping the output. At the same time it is bounded below, which yields a strong regularization effect and can help reduce overfitting. It is defined as:

$$\text{Mish}(x) = x \cdot \tanh(\ln(1 + e^{x}))$$
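A direct NumPy translation of that definition (np.log1p computes ln(1 + x)):

```python
import numpy as np

def mish(x):
    """Mish: x * tanh(softplus(x)); unbounded above, bounded below."""
    return x * np.tanh(np.log1p(np.exp(x)))

print(mish(np.array([-2.0, 0.0, 2.0])))  # ~[-0.252 0. 1.944]
```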
SiLU Activation
The SiLU activation function (also called swish) has the same properties: it is unbounded above, so it avoids saturation due to capping, and it is bounded below, which adds a regularizing effect that can decrease overfitting. It is defined as:

$$\text{SiLU}(x) = x \cdot \sigma(x) = \frac{x}{1 + e^{-x}}$$
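And in NumPy:

```python
import numpy as np

def silu(x):
    """SiLU (swish): x * sigmoid(x); unbounded above, bounded below."""
    return x / (1.0 + np.exp(-x))

print(silu(np.array([-2.0, 0.0, 2.0])))  # ~[-0.238 0. 1.762]
```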