7 - Latent errors and systems disasters

  • Print publication year: 1990
  • Online publication date: June 2012

Summary

In considering the human contribution to systems disasters, it is important to distinguish two kinds of error: active errors, whose effects are felt almost immediately, and latent errors, whose adverse consequences may lie dormant within the system for a long time, only becoming evident when they combine with other factors to breach the system's defences (see Rasmussen & Pedersen, 1984). In general, active errors are associated with the performance of the ‘front-line’ operators of a complex system: pilots, air traffic controllers, ships’ officers, control room crews and the like. Latent errors, on the other hand, are most likely to be spawned by those whose activities are removed in both time and space from the direct control interface: designers, high-level decision makers, construction workers, managers and maintenance personnel.

Detailed analyses of recent accidents, most particularly those at Flixborough, Three Mile Island, Heysel Stadium, Bhopal, Chernobyl and Zeebrugge, as well as the Challenger disaster, have made it increasingly apparent that latent errors pose the greatest threat to the safety of a complex system. In the past, reliability analyses and accident investigations focused primarily upon active operator errors and equipment failures. Yet while operators can, and frequently do, make errors in their attempts to recover from an out-of-tolerance system state, many of the root causes of such emergencies are usually present within the system long before these active errors are committed.

Human Error
  • Online ISBN: 9781139062367
  • Book DOI: https://doi.org/10.1017/CBO9781139062367