Backpropagation
The algorithm used to compute how much each model parameter contributed to the error, so the network can update itself during training.
Backpropagation is the method neural networks use to learn. After the model makes a prediction, the system measures how wrong it was using a loss function, then propagates that error backward through the network to calculate how each weight should change.
This process produces gradients, which tell the optimizer whether each parameter should increase or decrease to reduce future error. Without backpropagation, modern deep learning would not be practical.
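To make the idea concrete, here is a minimal sketch (an illustrative example, not from the source) of backpropagation through a single neuron y = w * x + b with a squared-error loss. The gradients come from the chain rule: the loss's sensitivity to the output, multiplied by the output's sensitivity to each parameter.

```python
def forward(w, b, x):
    # Single-neuron prediction: y = w * x + b
    return w * x + b

def loss(y_pred, y_true):
    # Squared-error loss measures how wrong the prediction was
    return (y_pred - y_true) ** 2

def backward(w, b, x, y_true):
    # Chain rule: dL/dw = dL/dy * dy/dw and dL/db = dL/dy * dy/db
    y_pred = forward(w, b, x)
    dL_dy = 2 * (y_pred - y_true)  # derivative of squared error w.r.t. y
    dL_dw = dL_dy * x              # dy/dw = x
    dL_db = dL_dy * 1.0            # dy/db = 1
    return dL_dw, dL_db

grad_w, grad_b = backward(w=0.5, b=0.0, x=2.0, y_true=3.0)
# y_pred = 1.0, so the error term is -4.0: grad_w = -8.0, grad_b = -4.0
```

The negative gradients tell the optimizer that both parameters should increase to reduce the error, which is exactly the signal described above. Real frameworks apply this same chain-rule logic automatically across millions of parameters.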
The training loop breaks down into four steps:
- Forward pass: make a prediction using the current weights.
- Loss: measure how wrong the prediction was.
- Backward pass: calculate blame for the error across the whole network.
- Update: adjust weights to do a little better next time.
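The steps above can be sketched as a single training step (a hedged toy example; the function name `train_step` and the one-neuron model are illustrative assumptions, not a real framework API):

```python
def train_step(w, b, x, y_true, lr=0.1):
    # Forward pass: make a prediction with the current weights
    y_pred = w * x + b
    # Loss: measure how wrong the prediction was (squared error)
    error = y_pred - y_true
    # Backward pass: compute gradients via the chain rule
    grad_w = 2 * error * x
    grad_b = 2 * error
    # Update: move each weight against its gradient
    return w - lr * grad_w, b - lr * grad_b

w, b = 0.0, 0.0
for _ in range(50):
    w, b = train_step(w, b, x=2.0, y_true=3.0)
# After training, the prediction w * 2.0 + b is close to the target 3.0
```

In practice the update step is delegated to an optimizer (plain gradient descent here), but the forward/loss/backward/update cycle is the same.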
Why Backpropagation Matters
- Efficient learning – updates millions or billions of parameters systematically
- Scalable training – works across deep multi-layer networks
- Core deep learning method – used in language, vision, and audio models
- Enables optimization – provides the gradients needed for optimizers
Backpropagation works hand-in-hand with gradient descent and hardware acceleration. It is one of the core reasons neural networks became trainable at the scale seen in modern AI systems.