Gradient descent is an optimization algorithm that iteratively updates model weights to minimize a loss function. Because the gradient points in the direction of steepest increase, each update moves the weights in the opposite direction, stepping toward lower loss: w ← w − η∇L(w), where η is the learning rate.
Virtually all modern neural network training relies on variants of gradient descent, such as stochastic gradient descent (SGD) and Adam, to learn from data.
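The update rule above can be sketched in a few lines. This is a minimal illustration, not a training loop from any particular library: it minimizes the one-dimensional function f(x) = (x − 3)², whose gradient is 2(x − 3), so the iterates should converge to the minimum at x = 3. The function and parameter names are chosen for this example.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function given its gradient, starting from x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # step opposite the gradient (toward lower loss)
    return x

# Gradient of f(x) = (x - 3)^2 is 2 * (x - 3); minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # converges close to 3.0
```

In real neural networks, x is a vector of millions of weights and the gradient is computed by backpropagation, but the core update is the same.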