Human error refers to actions or decisions that fail to achieve their intended outcome. Psychologist James Reason (1990) classified such errors into three types:
- Slips — execution errors (correct plan, wrong action)
- Lapses — memory errors (correct plan, forgotten step)
- Mistakes — planning errors (wrong plan, correct execution)
The term is somewhat misleading, because most "human errors" in complex systems arise from predictable failure modes that could be anticipated and mitigated through design. Reason and other safety researchers have argued for replacing the blame-oriented view ("human error caused this") with a system view ("the system allowed this error to reach the patient"). Sidney Dekker's *The Field Guide to Understanding Human Error* is the definitive treatment of this reframing.
Key insights from human error research:
- Errors are predictable — they arise from specific cognitive bottlenecks
- Training cannot eliminate them — even experts make errors under pressure
- Individual blame is unhelpful — it stops investigation before the systemic causes are found
- Defence in depth — multiple imperfect layers catch what any single one misses
- Error-tolerant design — systems should fail safely, allow recovery, and prevent propagation
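The last two insights lend themselves to a concrete illustration. Below is a minimal, hypothetical Python sketch (the `DocumentStore` class and its methods are invented for this example, not from any cited source) showing two error-tolerant design moves: invalid input is rejected before it can propagate, and a destructive action is implemented as a reversible soft delete so a slip can be recovered rather than punished.

```python
from dataclasses import dataclass


@dataclass
class Document:
    name: str
    deleted: bool = False  # soft-delete flag instead of destructive removal


class DocumentStore:
    """Hypothetical store illustrating error-tolerant design:
    invalid input fails safely, and deletion is reversible."""

    def __init__(self) -> None:
        self._docs: dict[str, Document] = {}
        self._trash: list[str] = []  # recovery path for slips

    def add(self, name: str) -> None:
        # Fail safely: reject invalid input instead of letting it propagate
        if not name.strip():
            raise ValueError("document name must be non-empty")
        self._docs[name] = Document(name)

    def delete(self, name: str) -> None:
        # Soft delete: the action is reversible, so a slip is recoverable
        doc = self._docs.get(name)
        if doc is not None:
            doc.deleted = True
            self._trash.append(name)

    def undo_delete(self) -> None:
        # Recovery: restore the most recently deleted document
        if self._trash:
            self._docs[self._trash.pop()].deleted = False

    def active(self) -> list[str]:
        return [n for n, d in self._docs.items() if not d.deleted]
```

A user who deletes the wrong document (a classic slip: correct plan, wrong action) can call `undo_delete()` and carry on; the design absorbs the error instead of making it final.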
The aviation "just culture" approach, now spreading to healthcare and software engineering, explicitly separates blame from investigation: incidents are analysed to prevent recurrence, not to punish the person closest to the failure. This cultural shift, as much as any specific design technique, represents the mature human factors response to error.
Related terms: Slip, Lapse, Mistake, Reason's Swiss Cheese Model
Discussed in:
- Chapter 10: Design Laws from Aviation and Engineering — Error Tolerance and Fail-Safe Design
Also defined in: Textbook of Usability