Autoregressive describes models that generate output one element at a time, where each new element depends on all previously generated elements. The model "regresses" on its own previous outputs, hence "auto" (self) + "regressive."
All major LLMs (GPT, Claude, LLaMA) are autoregressive, generating text token by token, with each token conditioned on the prompt plus every token generated so far.
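The generation loop can be sketched in a few lines. This is a toy illustration, not a real LLM: `next_token` here is a stand-in rule that just cycles through a tiny vocabulary, whereas a real model would be a neural network that samples from a probability distribution over its vocabulary given the prefix.

```python
def next_token(context):
    """Toy stand-in for a model: cycle to the next symbol in a tiny vocabulary.
    A real autoregressive model would return a sample from p(token | context)."""
    vocab = ["a", "b", "c"]
    if not context:
        return vocab[0]
    return vocab[(vocab.index(context[-1]) + 1) % len(vocab)]

def generate(prompt, n_tokens):
    tokens = list(prompt)
    for _ in range(n_tokens):
        # The autoregressive step: each new token is conditioned on
        # ALL previously generated tokens, then appended to the sequence.
        tokens.append(next_token(tokens))
    return tokens

print(generate(["a"], 4))  # -> ['a', 'b', 'c', 'a', 'b']
```

The key structural point is the loop: output at step *t* feeds back in as input at step *t+1*, which is why generation cost grows with sequence length and why the model cannot revise earlier tokens.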