Activation function
In neural networks, an activation function transforms the weighted sum of a neuron's inputs plus a bias term into the neuron's output. By biological analogy, the activation function determines the "firing rate" of a neuron in response to a stimulus. These functions introduce non-linearity into a network, enabling it to perform complex tasks such as image recognition and language processing. Without non-linear activation functions, a neural network, however deep, collapses into a single linear transformation and can only model linear relationships between inputs and outputs.
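In symbols (using conventional notation, not notation defined above): a neuron with inputs x_1, ..., x_n, weights w_1, ..., w_n, bias b, and activation function f produces the output

```latex
a = f\left(\sum_{i=1}^{n} w_i x_i + b\right)
```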
Widely used activation functions include:
- sigmoid function
- rectified linear unit (ReLU) function
- hyperbolic tangent (tanh) function
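For concreteness, here is a minimal NumPy sketch of the three functions listed above, applied to a single neuron's weighted sum; the weights, inputs, and bias are made-up example values.

```python
import numpy as np

def sigmoid(z):
    """Sigmoid: squashes any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    """Rectified linear unit: passes positive inputs through, zeroes out negatives."""
    return np.maximum(0.0, z)

def tanh(z):
    """Hyperbolic tangent: squashes any real input into the interval (-1, 1)."""
    return np.tanh(z)

# One neuron with three inputs (example values).
x = np.array([0.5, -1.2, 3.0])   # inputs
w = np.array([0.4, 0.3, -0.2])   # weights
b = 0.1                          # bias

z = np.dot(w, x) + b             # weighted sum of inputs plus bias
print("pre-activation z =", z)   # -0.66
print("sigmoid(z) =", sigmoid(z))
print("relu(z)    =", relu(z))
print("tanh(z)    =", tanh(z))
```

Note how ReLU maps the negative pre-activation to exactly 0, while sigmoid and tanh squash it smoothly into their respective output ranges.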