A cognitive bias is a systematic pattern of deviation from rational decision-making — a consistent, predictable way in which human judgement departs from what an ideally rational agent would do. Cognitive biases arise from the heuristics and shortcuts the mind uses to cope with limited time, information, and processing capacity.
The study of cognitive biases was transformed by Daniel Kahneman and Amos Tversky, whose research in the 1970s and 1980s identified dozens of biases (work for which Kahneman received the 2002 Nobel Memorial Prize in Economic Sciences; Tversky died in 1996). Kahneman later integrated these findings into a dual-process theory of thinking: fast, automatic, intuitive "System 1" and slow, effortful, deliberative "System 2". Most biases arise from System 1 shortcuts that usually work but sometimes fail.
Biases particularly relevant to interface design:
- Anchoring — initial information disproportionately influences judgement
- Default effect — users overwhelmingly accept default options
- Framing effect — equivalent descriptions produce different choices
- Satisficing — choosing the first adequate option rather than optimising
- Confirmation bias — favouring information that confirms existing beliefs
- Availability heuristic — judging frequency by ease of recall
- Peak-end rule — remembering experiences by their peak and end
- Automation bias — over-accepting computer recommendations
- Loss aversion — losses feel larger than equivalent gains
Good design acknowledges cognitive biases rather than fighting them. Safe defaults exploit the default effect benevolently. Clear framing helps users understand choices. Recognition-based interfaces work with satisficing. Dark patterns exploit the same biases against users' interests — the line between ethical choice architecture and manipulation is drawn by whose interest the design serves.
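As a minimal sketch of the benevolent use of the default effect described above (all names here are hypothetical, not from the source), a settings model can make the privacy-preserving choice the pre-selected one, since most users will accept defaults unchanged:

```python
from dataclasses import dataclass

@dataclass
class SignupSettings:
    # Benevolent defaults: the pre-selected value is the one that
    # protects the user, relying on the default effect in the
    # user's favour rather than against it.
    share_usage_data: bool = False   # opt-in, not opt-out
    marketing_emails: bool = False   # opt-in, not opt-out
    two_factor_auth: bool = True     # safe behaviour enabled by default

# A dark pattern would invert these defaults to exploit the same bias
# against the user's interests.
settings = SignupSettings()
```

The ethical distinction is not in the mechanism (both use the default effect) but in whose interest the defaults serve.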
Related terms: Anchoring, Default Effect, Framing Effect, Satisficing, Choice Architecture, Bounded Rationality
Discussed in:
- Chapter 4: Attention and Decision-Making — Decision-Making and Cognitive Biases
Also defined in: Textbook of Usability