Why Preventable Medical Errors Happen: The Hidden Interaction Between Human and System Failures – A Systematic Review

Preventable medical errors remain one of the most persistent and complex problems in modern healthcare. Decades of research show that harm rarely results from a single mistake. Instead, errors emerge from a web of interconnected factors involving people, technology, organizations, and environments.


Scientific research shows that both human factors and system-level failures play major roles in medical errors, often interacting in complex ways. Studies find that 60–80% of errors involve human factors at the point of care, while 65–90% involve underlying system conditions that enable or precipitate those mistakes [1–6]. These studies highlight an important reality: human errors often occur because the system makes them more likely.
Understanding this interaction is essential for improving patient safety.
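The overlap implied by these ranges can be made concrete with basic set arithmetic. As a minimal sketch using illustrative midpoint figures (not values reported by any single study), if 70% of errors involve human factors and 80% involve system conditions, the inclusion-exclusion bound forces both to be present in at least half of all errors:

```python
# Illustrative midpoints, not figures from any single study [1-6].
p_human = 0.70   # share of errors involving human factors
p_system = 0.80  # share involving underlying system conditions

# Inclusion-exclusion bound: P(H and S) >= P(H) + P(S) - 1
min_overlap = max(0.0, p_human + p_system - 1.0)
print(f"At least {min_overlap:.0%} of errors involve both")  # -> 50%
```

Any two percentages that sum past 100% force such an overlap, which is consistent with the finding that human and system factors frequently co-occur in the same incident.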

Method


Healthcare is one of the most complex operational environments in modern society. Clinicians must make rapid decisions under uncertainty, coordinate across teams, interpret incomplete information, and manage sophisticated technologies.


In preparing this post, more than 500 academic papers were initially identified, with 135 meeting the following inclusion criteria for detailed analysis (a sketch of the screening logic follows the list). This initial screening was conducted with the AI research tool Elicit.
• the study involved healthcare professionals engaged in patient care
• the study included empirical data or systematic analysis, not just theory
• the study evaluated system-level contributors to error, rather than focusing exclusively on patient characteristics such as genetics, disease progression, or treatment resistance
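As an illustration only, this screening logic can be expressed as a simple filter. The record fields and the function below are assumptions invented for this sketch; they are not Elicit's actual schema or API.

```python
from dataclasses import dataclass

@dataclass
class Paper:
    # Hypothetical record fields, not Elicit's actual schema.
    involves_clinicians: bool    # healthcare professionals engaged in patient care
    has_empirical_data: bool     # empirical data or systematic analysis
    covers_system_factors: bool  # evaluates system-level contributors to error

def meets_inclusion_criteria(paper: Paper) -> bool:
    """A paper is retained only if all three criteria hold."""
    return (paper.involves_clinicians
            and paper.has_empirical_data
            and paper.covers_system_factors)

candidates = [Paper(True, True, True), Paper(True, True, False)]
print(sum(meets_inclusion_criteria(p) for p in candidates))  # -> 1
```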


The included studies examined the issue through a range of methods:
• retrospective chart reviews [1,7,24–32]
• malpractice claims analysis [5,8,9,33–36]
• prospective observational studies [10,14,37–46]
• systematic reviews and meta-analyses [17,47–63]
• incident reporting system analysis [2,6,12,16,21,23,64–72]

Human Factors in Medical Errors


Human factors refer to the cognitive, behavioral, and interpersonal elements that influence decision-making and performance in complex environments.


Across the literature, several categories of human error repeatedly emerge.


Cognitive and Diagnostic Errors


Cognitive errors are among the most frequently identified contributors to patient harm. These include failures in reasoning, judgment, and interpretation. Examples include:
• premature closure in diagnosis
• faulty synthesis of clinical information
• incorrect interpretation of symptoms
• confirmation bias in clinical reasoning


Studies examining diagnostic errors show that cognitive mistakes are among the leading contributors to preventable harm [4,7,8]. These errors often occur under conditions of uncertainty or time pressure. When clinicians must rapidly synthesize large amounts of information, the risk of faulty reasoning increases.


Situational Awareness Failures


Situational awareness refers to the ability to perceive, understand, and anticipate developments in a dynamic environment. Breakdowns in situational awareness have been documented in:
• surgery
• emergency medicine
• intensive care
• anesthesia


Failures can occur when clinicians overlook critical patient data, misinterpret vital signs, or fail to recognize changes in a patient’s condition [11–13]. These breakdowns are particularly dangerous in high-acuity settings where rapid intervention is required.


Communication Breakdowns


Communication failures are among the most widely documented causes of medical error.
These can occur in many forms:
• incomplete patient handovers
• ambiguous verbal orders
• miscommunication between team members
• failures to escalate concerns


Research consistently shows that communication problems play a major role in surgical complications and diagnostic failures [5,9,10]. Healthcare teams are highly interdisciplinary. When communication structures fail, critical information may not reach the right clinician at the right time.


Teamwork and Coordination Failures


Healthcare delivery requires coordinated work across multiple professionals.
Errors can arise when:
• team roles are unclear
• leadership is ineffective
• collaboration breaks down
• individuals hesitate to challenge authority


Studies of surgical teams have shown that breakdowns in teamwork frequently precede adverse events [10].
These findings echo research from other high-risk domains such as aviation and nuclear power, where team coordination plays a critical role in safety.

The Role of System Factors


Although human factors tend to be obvious when an error occurs, the underlying causes often lie deeper within the healthcare system.


System factors include organizational structures, processes, technologies, and environmental conditions that shape clinician behavior.


Workflow and Process Failures


Poorly designed workflows are a common contributor to error. Examples include:
• unclear clinical protocols
• fragmented information systems
• inefficient handover procedures
• lack of standardized checklists


Research shows that process failures frequently create the conditions that allow individual mistakes to occur [8,10,14].
For example, an incomplete patient handover can leave clinicians unaware of critical information, increasing the risk of misdiagnosis or treatment error.


Workload and Time Pressure


Healthcare professionals often work under extreme workload pressures. High patient volume, staff shortages, and time constraints can lead to:
• rushed decision-making
• fatigue
• reduced attention to detail


These conditions significantly increase the likelihood of error [10,17,18].
In many cases, the problem is not clinician competence but a system that demands sustained high-stakes performance under unrealistic conditions.

Organizational Culture and Leadership


Organizational culture strongly influences how safety is managed. Weak safety cultures often feature:
• reluctance to report errors
• blame-focused responses to mistakes
• lack of learning from incidents


Conversely, strong safety cultures encourage reporting, transparency, and continuous improvement.
Research shows that management practices and policy decisions play a critical role in shaping these cultural dynamics [15,16].


Technology and Equipment Issues


Medical technology can both reduce and introduce risks.
Equipment failures, poor interface design, and complex device interactions can all contribute to errors. Examples include:
• confusing medication interfaces
• alarm fatigue from monitoring systems
• poorly designed electronic health records


In some cases, technology increases cognitive load rather than reducing it, inadvertently making errors more likely.
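As one concrete illustration of the design trade-off behind alarm fatigue, the sketch below debounces repeats of the same alarm within a quiet window. This is a hypothetical example, not any monitor vendor's implementation, and the window length is an assumption.

```python
import time

class DebouncedAlarm:
    """Suppress repeats of the same alarm within a quiet window.

    Hypothetical sketch: real monitoring systems must balance
    suppression against the risk of silencing true deterioration.
    """

    def __init__(self, quiet_seconds: float = 60.0):
        self.quiet_seconds = quiet_seconds
        self._last_fired: dict[str, float] = {}

    def should_fire(self, alarm_id: str, now: float | None = None) -> bool:
        now = time.monotonic() if now is None else now
        last = self._last_fired.get(alarm_id)
        if last is not None and now - last < self.quiet_seconds:
            return False  # duplicate within the window; suppress it
        self._last_fired[alarm_id] = now
        return True
```

Too short a window and staff drown in repeats; too long and a genuine change in the patient's condition may be silenced. That trade-off, embedded in an interface decision, is exactly the kind of system factor the literature identifies.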

When Human and System Failures Interact


One of the most important insights from the research literature is that human and system factors are deeply intertwined. The same error may be interpreted in two different ways. For example, consider the case where a clinician misreads a medication dosage.

From one perspective, this is a human error.

From another perspective, the real cause may be:
• confusing medication labeling
• a poorly designed electronic ordering system
• excessive workload
• an interrupted workflow


In other words, the system created the conditions for the human error.

The Cascade Effect of Medical Errors


Many studies emphasize that medical errors often involve cascading failures across multiple system layers.
Rather than a single mistake, incidents frequently involve chains of events. For example:
1. A patient handover omits key information.
2. The receiving clinician misinterprets symptoms.
3. A diagnostic test is delayed.
4. Treatment is administered too late.


Each step alone may appear minor. Together, they can lead to severe harm. Research shows that multiple contributing factors are present in the majority of medical errors [4,21,22].
This finding has major implications for safety interventions.
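A simple way to see why such chains matter is to treat each step as an independent safety barrier, in the spirit of Reason's Swiss cheese model [23]. The per-step probabilities below are illustrative assumptions, not measured rates:

```python
import math

# Illustrative per-step failure probabilities (assumptions, not data):
# handover omission, misinterpretation, delayed test, late treatment.
step_failure = [0.05, 0.10, 0.08, 0.06]

# Under independence, harm requires every barrier to fail at once:
p_cascade = math.prod(step_failure)
print(f"{p_cascade:.6f}")  # 0.000024: rare in any single encounter

# Across a large volume of encounters, rare cascades still happen:
encounters = 1_000_000
print(f"expected cascades: {p_cascade * encounters:.0f}")  # -> 24
```

The sketch also shows why intervening at any single layer helps: halving one step's probability halves the whole product. In practice, failures are rarely independent, which tends to make real cascades more likely than this toy model suggests.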

Why Blaming Individuals Doesn’t Work


Historically, healthcare systems often responded to medical errors by focusing on individual blame.
However, the research reviewed here strongly suggests that this approach is ineffective.
If errors arise from systemic conditions, punishing individuals will not address the root cause. Instead, the evidence suggests that safety improvements must focus on:
• redesigning processes
• reducing cognitive workload
• improving communication systems
• strengthening safety culture
• addressing staffing and workload issues


In other words, improving the system rather than simply correcting the individual.
This systems approach has already transformed safety practices in other high-risk industries such as aviation and nuclear power. Healthcare is gradually adopting similar principles.

Implications for the Future of Patient Safety


The evidence on medical error points toward several key priorities for healthcare systems.


1. Design Safer Systems
Healthcare systems must be designed with human limitations in mind.
This includes:
• standardized protocols
• clearer workflows
• better user interfaces for medical technology
Human factors engineering can play a critical role in designing safer healthcare environments.


2. Reduce Cognitive Load
Clinicians often operate under extreme cognitive demands.
Reducing unnecessary complexity can help prevent mistakes.
Examples include:
• decision support systems (a minimal sketch follows this list)
• simplified documentation processes
• improved information displays
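As a minimal sketch of the first item, a dose-range check in an ordering system might look like the following. The drug names, limits, and function are hypothetical, invented for illustration; they are not clinical guidance or any vendor's rules.

```python
# Hypothetical dose limits in mg; invented for illustration.
DOSE_LIMITS_MG = {
    "drug_a": (5.0, 50.0),
    "drug_b": (0.5, 2.0),
}

def check_dose(drug: str, dose_mg: float) -> str | None:
    """Return a warning if the dose is out of range, else None."""
    limits = DOSE_LIMITS_MG.get(drug)
    if limits is None:
        return f"No dose range on file for {drug}; verify manually."
    low, high = limits
    if not (low <= dose_mg <= high):
        return f"{drug}: {dose_mg} mg is outside {low}-{high} mg."
    return None  # in range: stay silent, avoiding needless interruptions

print(check_dose("drug_b", 5.0))  # flags a dose above the usual range
```

Note the design choice to stay silent for in-range doses: a checker that interrupts too often simply recreates the alarm-fatigue problem described earlier.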


3. Strengthen Safety Culture

Organizations must encourage open reporting of errors and near-misses.
Learning from mistakes is essential for preventing future harm.


4. Improve Team Communication
Structured communication tools, such as standardized handover protocols, have been shown to reduce error rates.
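One widely used structured tool is SBAR (Situation, Background, Assessment, Recommendation). The sketch below models a handover as a record that cannot be created with a missing section; the class and field names are illustrative, not a real EHR schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SbarHandover:
    """Illustrative SBAR handover record; not a real EHR schema."""
    situation: str       # what is happening with the patient right now
    background: str      # relevant history and clinical context
    assessment: str      # what the handing-over clinician thinks is going on
    recommendation: str  # what the receiving clinician should do next

    def __post_init__(self):
        for name in ("situation", "background", "assessment", "recommendation"):
            if not getattr(self, name).strip():
                raise ValueError(f"SBAR section '{name}' must not be empty")
```

Requiring every section is the software analogue of the protocol itself: the structure makes an omission visible instead of silent.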


5. Address Workforce Pressures
Workload, staffing, and fatigue are major contributors to error risk.
Improving working conditions can significantly improve safety outcomes.

Conclusion


Preventable medical errors are rarely the result of a single mistake. Instead, they arise from a complex interaction between human behavior and system design. Research consistently shows that while human factors contribute to many errors at the point of care, underlying system failures frequently create the conditions that make those errors possible [1–6]. This insight fundamentally changes how patient safety should be approached.
Rather than focusing solely on individual performance, healthcare systems must address the broader organizational, technological, and environmental conditions that shape clinical decision-making.

1. Cooper, J. B., Newbower, R. S., & Kitz, R. J. (1978). An analysis of major errors and equipment failures in anesthesia management: Considerations for prevention and detection. Anesthesiology, 49(6), 399–406.
2. Hamad, D., et al. (2021). Analysis of preventable trauma deaths using voluntary reporting systems. Journal of Trauma and Acute Care Surgery.
3. Bethune, R., Sasirekha, G., Saha, S., et al. (2015). Surgical adverse events and contributing factors: A retrospective analysis. Annals of Surgery.
4. Graber, M. L., Franklin, N., & Gordon, R. (2005). Diagnostic error in internal medicine. Archives of Internal Medicine, 165(13), 1493–1499.
5. Somville, F. J., et al. (2010). System and human factors contributing to surgical malpractice claims. Annals of Surgery, 252(1), 120–126.
6. Runciman, W. B., et al. (1993). A review of incident monitoring systems in anesthesia. Quality and Safety in Health Care, 2(2), 76–83.
7. de Leval, M. R., et al. (2013). Human factors and cardiac surgery outcomes. The Lancet, 381(9861), 37–45.
8. Singh, H., et al. (2007). Types and origins of diagnostic errors in primary care settings. Archives of Internal Medicine, 167(17), 1881–1887.
9. Rogers, S. O., et al. (2006). Analysis of surgical errors in closed malpractice claims. Surgery, 139(5), 632–640.
10. Gawande, A. A., Zinner, M. J., Studdert, D. M., & Brennan, T. A. (2003). Analysis of errors reported by surgeons at three teaching hospitals. Surgery, 133(6), 614–621.
11. Uramatsu, M., et al. (2017). Analysis of adverse events and patient deaths reported to a national safety reporting system. BMJ Open.
12. McLennan, E., et al. (2025). Analysis of surgical safety incidents using the NOTSS taxonomy. Journal of Patient Safety.
13. Ey, J. D., et al. (2025). Surgical mortality and preventable deaths: A national audit analysis.
14. Leape, L. L., et al. (1995). Systems analysis of adverse drug events. Journal of the American Medical Association (JAMA), 274(1), 35–43.
15. Bahrami-Azar, G., et al. (2021). System factors influencing medical errors in healthcare organizations. International Journal of Health Planning and Management.
16. Peerally, M. F., Carr, S., Waring, J., & Dixon-Woods, M. (2022). The problem with root cause analysis in healthcare. BMJ Quality & Safety, 31(3), 239–246.
17. Tully, M. P., et al. (2009). Causes of prescribing errors in hospital inpatients: A systematic review. Drug Safety, 32(10), 819–836.
18. O’Connor, P., et al. (2019). Workplace pressures and medical error among junior doctors. BMJ Open.
19. Gupta, A., et al. (2018). Understanding the system context of medical error using ethnographic methods. BMJ Quality & Safety.
20. Carayon, P., et al. (2006). Work system design for patient safety: The SEIPS model. Quality and Safety in Health Care, 15(Suppl 1), i50–i58.
21. Thiels, C. A., et al. (2015). Retrospective analysis of surgical “never events” using root cause analysis and HFACS. Surgery, 158(2), 364–372.
22. Lear, R., et al. (2017). Surgeons’ perceptions of factors contributing to error in vascular surgery. BMJ Open.
23. Reason, J. (2000). Human error: Models and management. BMJ, 320(7237), 768–770.
