Backpropagation is the algorithm that computes the gradient of the loss function with respect to every weight in the network. By applying the chain rule backward through the layers, it efficiently determines how each weight should change to reduce the loss.
It is the foundation of how nearly all neural networks are trained, including LLMs with billions of parameters.
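To make the chain-rule mechanics concrete, here is a minimal sketch of backpropagation for a tiny two-layer network using only NumPy. The layer sizes, tanh hidden activation, and squared-error loss are illustrative assumptions, not details from the text above; the point is how each gradient is built by propagating the error backward from the output, layer by layer.

```python
# Minimal backpropagation sketch for a 3 -> 5 -> 1 network (assumed sizes).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 examples, 3 input features, 1 target value each.
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Weights for the two layers.
W1 = rng.normal(size=(3, 5)) * 0.1
W2 = rng.normal(size=(5, 1)) * 0.1

lr = 0.1
for step in range(100):
    # ---- Forward pass: compute the loss from the current weights ----
    h_pre = X @ W1          # pre-activation of the hidden layer
    h = np.tanh(h_pre)      # hidden activations
    y_hat = h @ W2          # network output
    loss = 0.5 * np.mean((y_hat - y) ** 2)

    # ---- Backward pass: chain rule applied layer by layer ----
    # dL/dy_hat for the mean squared error loss
    grad_y_hat = (y_hat - y) / X.shape[0]
    # dL/dW2 = h^T @ dL/dy_hat   (chain rule through the output layer)
    grad_W2 = h.T @ grad_y_hat
    # dL/dh, then through tanh: dL/dh_pre = dL/dh * (1 - tanh(h_pre)^2)
    grad_h = grad_y_hat @ W2.T
    grad_h_pre = grad_h * (1.0 - h ** 2)
    # dL/dW1 = X^T @ dL/dh_pre   (chain rule through the first layer)
    grad_W1 = X.T @ grad_h_pre

    # ---- Update: nudge each weight opposite its gradient ----
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2

print(f"loss after training: {loss:.4f}")
```

The efficiency comes from reuse: the error signal computed at the output (`grad_y_hat`) is propagated backward once and shared by every earlier layer, rather than recomputing the effect of each weight on the loss from scratch.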