Optimization algorithms
Optimization algorithms are widely used mathematical procedures that solve problems by maximizing or minimizing a function. They are applied to a variety of purposes, from patient scheduling to radiology.
Machine learning
In machine learning, optimization algorithms are used to minimize a function known as the loss function (or error function). By minimizing the loss, an optimization algorithm drives the difference between the actual and predicted outputs toward zero, making the model more accurate at its task.
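As a concrete illustration, the sketch below fits a small linear model by gradient descent on a mean-squared-error loss. The synthetic data, learning rate, and step count are illustrative assumptions, not values from the text.

```python
# Minimal sketch: minimizing a mean-squared-error loss with gradient descent.
# Data, learning rate, and step count are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=100)                              # synthetic inputs
y = 3.0 * X + 1.0 + rng.normal(scale=0.1, size=100)   # noisy targets

w, b = 0.0, 0.0   # model parameters
lr = 0.1          # learning rate

for _ in range(200):
    pred = w * X + b
    error = pred - y                 # predicted minus actual output
    loss = np.mean(error ** 2)       # the loss (error) function
    grad_w = 2 * np.mean(error * X)  # dLoss/dw
    grad_b = 2 * np.mean(error)      # dLoss/db
    w -= lr * grad_w                 # step opposite the gradient
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, final loss={loss:.4f}")
```

As the loss shrinks over the iterations, the learned parameters approach the true values used to generate the data, which is exactly the sense in which minimizing the loss makes the model more accurate.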
Neural networks
In a neural network, optimization typically relies on backpropagation: the current error (loss) is propagated backwards through the layers, the gradient of the loss with respect to each layer's parameters is computed, and one of the optimization algorithms uses this gradient to update the parameters. In this context, learning means minimizing the loss function.
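A minimal sketch of this process for a single hidden layer is shown below, using only numpy. The network shape, tanh activation, and learning rate are assumptions chosen for illustration.

```python
# Minimal sketch of backpropagation through one hidden layer (numpy only).
# Shapes, activation, and learning rate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(4, 3))   # batch of 4 inputs with 3 features
t = rng.normal(size=(4, 1))   # targets
W1 = rng.normal(size=(3, 5))  # hidden-layer weights
W2 = rng.normal(size=(5, 1))  # output-layer weights
lr = 0.01

# forward pass
h = np.tanh(x @ W1)              # hidden-layer activations
yhat = h @ W2                    # network output
loss = np.mean((yhat - t) ** 2)  # current loss

# backward pass: the error is propagated to the previous layer
d_yhat = 2 * (yhat - t) / len(x)      # dLoss/d_yhat
gW2 = h.T @ d_yhat                    # gradient for the output layer
d_h = (d_yhat @ W2.T) * (1 - h ** 2)  # error propagated through tanh
gW1 = x.T @ d_h                       # gradient for the hidden layer

# a gradient-descent step performs the "learning"
W2 -= lr * gW2
W1 -= lr * gW1
```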
There are several types of optimization algorithms used in neural networks (a sketch of a few of their update rules follows the list):
- gradient descent variants
  - batch gradient descent
  - stochastic gradient descent
  - mini-batch gradient descent
- momentum
- Nesterov accelerated gradient (Nesterov momentum)
- algorithms with adaptive learning rates
  - AdaGrad
  - AdaDelta
  - RMSprop
  - Adam
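The sketch below illustrates the update rules for three of these algorithms: plain stochastic gradient descent, momentum, and Adam. The hyperparameter defaults follow common practice but are assumptions here, not values prescribed by the text.

```python
# Illustrative update rules for three of the optimizers listed above.
# Hyperparameter defaults follow common practice; they are assumptions.
import numpy as np

def sgd_step(param, grad, lr=0.01):
    """Plain (stochastic) gradient descent: step against the gradient."""
    return param - lr * grad

def momentum_step(param, grad, velocity, lr=0.01, beta=0.9):
    """Momentum: accumulate a decaying running sum of past gradients."""
    velocity = beta * velocity + grad
    return param - lr * velocity, velocity

def adam_step(param, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: adapt the step size using first and second gradient moments."""
    m = b1 * m + (1 - b1) * grad       # first moment (mean of gradients)
    v = b2 * v + (1 - b2) * grad ** 2  # second moment (uncentred variance)
    m_hat = m / (1 - b1 ** t)          # bias correction for early steps
    v_hat = v / (1 - b2 ** t)
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```

Momentum smooths the descent direction across steps, while Adam additionally scales each parameter's step by an estimate of the gradient's variance, which is what gives it an adaptive learning rate.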