Zero-Shot Chain of Thought

Intermediate [3/5]
Eliciting chain-of-thought reasoning without examples

Definition

Zero-Shot Chain of Thought triggers step-by-step reasoning without providing any examples. By simply adding phrases like "Let's think step by step" to a prompt, you can significantly improve the model's reasoning ability on complex tasks.

This technique, introduced in the paper "Large Language Models are Zero-Shot Reasoners," demonstrates that models have latent reasoning capabilities that can be activated with simple prompting cues.

Key Concepts

  • Trigger phrases: Simple additions like "Let's think step by step"
  • No examples needed: Works without providing demonstrations
  • Token efficiency: Uses fewer tokens than few-shot CoT
  • Broad applicability: Works across many reasoning tasks
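The mechanics above amount to string construction: append the trigger phrase to the question before sending it to the model. Here is a minimal sketch; the prompt layout (`Q:`/`A:`) follows the convention used in the examples below, and the function name is my own:

```python
# The canonical zero-shot CoT trigger phrase.
TRIGGER = "Let's think step by step."

def zero_shot_cot(question: str) -> str:
    """Wrap a bare question with the zero-shot CoT trigger phrase."""
    return f"Q: {question}\nA: {TRIGGER}"

prompt = zero_shot_cot(
    "A farmer has 17 sheep. All but 9 run away. How many are left?"
)
print(prompt)
# Q: A farmer has 17 sheep. All but 9 run away. How many are left?
# A: Let's think step by step.
```

The model's completion then continues from the trigger, producing the reasoning steps before the final answer.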

Examples

Without Zero-Shot CoT
Direct Question
Q: A farmer has 17 sheep. All but 9 run away. How many are left?
A: 8 sheep are left. ← WRONG (common error)
Without CoT, the model may jump to an incorrect answer.
With Zero-Shot CoT
Step-by-Step Trigger
Q: A farmer has 17 sheep. All but 9 run away. How many are left? Let's think step by step.
A: Let me work through this carefully:
  - The farmer starts with 17 sheep
  - "All but 9 run away" means 9 sheep remain
  - The sheep that ran away = 17 - 9 = 8 sheep
  - The sheep remaining = 9 sheep
9 sheep are left. ✓ CORRECT
The trigger phrase activates careful reasoning.

Interactive Exercise

Try Different Triggers

Write 3 different phrases that could trigger step-by-step reasoning:

Pro Tips
  • "Let's think step by step" is the most studied trigger phrase
  • Works best on arithmetic, logic, and multi-step reasoning
  • Combine with other techniques (e.g., self-consistency sampling or few-shot examples) for better results
  • May increase token usage due to longer responses
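The original paper actually uses a two-stage pipeline: the first prompt elicits the reasoning, and a second prompt appends that reasoning plus an answer-extraction cue. A minimal sketch, where `call_model` is a hypothetical stand-in for any text-in/text-out LLM API:

```python
def build_reasoning_prompt(question: str) -> str:
    """Stage 1: elicit step-by-step reasoning with the trigger phrase."""
    return f"Q: {question}\nA: Let's think step by step."

def build_extraction_prompt(reasoning_prompt: str, reasoning: str) -> str:
    """Stage 2: append the model's reasoning, then cue the final answer."""
    return f"{reasoning_prompt} {reasoning}\nTherefore, the answer is"

def zero_shot_cot_answer(question: str, call_model) -> str:
    """Run both stages; call_model is any prompt -> completion function."""
    stage1 = build_reasoning_prompt(question)
    reasoning = call_model(stage1)       # model produces the reasoning chain
    stage2 = build_extraction_prompt(stage1, reasoning)
    return call_model(stage2)            # model produces the final answer
```

Because the second call re-sends the full reasoning, this pipeline roughly doubles token usage relative to a single direct question, which is the trade-off noted in the tips above.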

Related Terms