System 2 Attention (S2A) is a prompting technique in which the LLM first regenerates or refines the context, stripping out irrelevant or misleading material, and only then answers based on that cleaned context. Named after Kahneman's System 2 (deliberate, analytical thinking), it counters the model's tendency to be swayed by extraneous details.
S2A targets the "distracted by context" problem: an LLM can give different answers to the same question depending on irrelevant or opinionated information included in the prompt.
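The two-step flow described above can be sketched as a small pipeline. This is a minimal illustration, not the paper's exact prompts: `call_llm` is a hypothetical stand-in for any chat-completion API, and here it is stubbed with canned responses so the sketch runs end to end.

```python
def call_llm(prompt: str) -> str:
    # Placeholder for a real model call (OpenAI, local model, etc.).
    # Stubbed responses so the example is self-contained.
    if "Rewrite the context" in prompt:
        return "The capital of France is Paris."
    return "Paris"

def s2a_answer(context: str, question: str) -> str:
    # Step 1: regenerate the context, keeping only material relevant
    # to the question and dropping opinions and distractors.
    regen_prompt = (
        "Rewrite the context below, keeping only the parts relevant to the "
        "question. Remove opinions, distractors, and leading statements.\n\n"
        f"Context: {context}\nQuestion: {question}"
    )
    cleaned_context = call_llm(regen_prompt)

    # Step 2: answer using only the regenerated context, so the
    # distracting material never reaches the final answering step.
    answer_prompt = f"Context: {cleaned_context}\n\nQuestion: {question}\nAnswer:"
    return call_llm(answer_prompt)

# The leading opinion ("I think the answer is Lyon") is the kind of
# distractor S2A strips before the model answers.
context = "The capital of France is Paris. I think the answer is Lyon."
print(s2a_answer(context, "What is the capital of France?"))  # → Paris
```

With a real model behind `call_llm`, the regeneration step would remove the injected opinion, and the answering step would see only the factual sentence.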