(also backward propagation of errors, backprop)
Backpropagation is an algorithm used in machine learning. It is applied in training feedforward neural networks (artificial neural networks in which the connections between nodes do not form cycles) as well as other parameterized networks built from differentiable nodes. Backpropagation is an efficient application of the chain rule (attributed to Leibniz) to such networks.
In supervised learning, backpropagation computes the gradient of the loss function with respect to each weight via the chain rule, evaluating the gradient one layer at a time and iterating backward from the last layer toward the first. Adjusting the weights according to these gradients reduces the discrepancy between predicted and actual outputs, which is how the network learns and improves its performance.
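As an illustrative sketch (not drawn from the source), the following Python/NumPy code performs backpropagation by hand on a tiny two-layer network with a sigmoid hidden layer, a linear output, and mean-squared-error loss. All names (x, y, W1, b1, W2, b2) and the network sizes are assumptions chosen for the example.

```python
import numpy as np

# Assumed toy setup: 4 samples, 3 input features, 5 hidden units, 1 output.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))            # inputs
y = rng.normal(size=(4, 1))            # targets

W1 = rng.normal(size=(3, 5)) * 0.1     # input -> hidden weights
b1 = np.zeros(5)
W2 = rng.normal(size=(5, 1)) * 0.1     # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: compute and cache each layer's output.
z1 = x @ W1 + b1
a1 = sigmoid(z1)
y_hat = a1 @ W2 + b2
loss = np.mean((y_hat - y) ** 2)

# Backward pass: apply the chain rule one layer at a time,
# iterating from the output layer back toward the input.
d_yhat = 2 * (y_hat - y) / y.shape[0]  # dL/d(y_hat) for MSE
dW2 = a1.T @ d_yhat                    # dL/dW2
db2 = d_yhat.sum(axis=0)               # dL/db2
d_a1 = d_yhat @ W2.T                   # propagate the error to the hidden layer
d_z1 = d_a1 * a1 * (1 - a1)            # sigmoid'(z1) = a1 * (1 - a1)
dW1 = x.T @ d_z1                       # dL/dW1
db1 = d_z1.sum(axis=0)                 # dL/db1

# A gradient-descent step would now move each weight against its gradient,
# e.g. W1 -= lr * dW1, as discussed in the optimization paragraph below.
```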
See also: artificial intelligence, machine learning
Gradient descent and its variants, such as stochastic gradient descent (SGD) and mini-batch gradient descent, are optimization algorithms that use backpropagation to compute gradients; a sketch of such an update loop follows. Other optimization algorithms, such as genetic algorithms and particle swarm optimization, do not rely on backpropagation for training neural networks.
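As a minimal sketch of how backpropagated gradients drive mini-batch SGD, the code below fits a linear model; for a single-layer network with MSE loss, backpropagation reduces to the one-line gradient shown. The data, learning rate, and batch size are assumptions chosen for the example.

```python
import numpy as np

# Assumed synthetic regression data: 100 samples, 3 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

w = np.zeros(3)                        # parameters to learn
lr, batch_size = 0.1, 10               # illustrative hyperparameters

for epoch in range(50):
    idx = rng.permutation(len(X))      # shuffle before forming mini-batches
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        Xb, yb = X[batch], y[batch]
        # Gradient of the mini-batch MSE loss w.r.t. w (chain rule):
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)
        w -= lr * grad                 # SGD update: step against the gradient

print(w)  # should approach true_w
```

Plain SGD uses only the current mini-batch gradient; variants such as momentum or Adam reuse the same backpropagated gradients but accumulate them differently.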