In considering the human contribution to systems disasters, it is important to distinguish two kinds of error: active errors, whose effects are felt almost immediately, and latent errors, whose adverse consequences may lie dormant within the system for a long time, only becoming evident when they combine with other factors to breach the system's defences (see Rasmussen & Pedersen, 1984). In general, active errors are associated with the performance of the ‘front-line’ operators of a complex system: pilots, air traffic controllers, ships’ officers, control room crews and the like. Latent errors, on the other hand, are most likely to be spawned by those whose activities are removed in both time and space from the direct control interface: designers, high-level decision makers, construction workers, managers and maintenance personnel.
Detailed analyses of recent accidents, most particularly those at Flixborough, Three Mile Island, Heysel Stadium, Bhopal, Chernobyl and Zeebrugge, as well as the Challenger disaster, have made it increasingly apparent that latent errors pose the greatest threat to the safety of a complex system. In the past, reliability analyses and accident investigations have focused primarily upon active operator errors and equipment failures. While operators can, and frequently do, make errors in their attempts to recover from an out-of-tolerance system state, many of the root causes of such emergencies were usually present within the system long before these active errors were committed.