Managing the Unexpected
High Reliability develops an organization’s strengths through individual actions.
Shared attitudes fill the gap between the organization and the individual to determine High Reliability.
As a sociologist, Charles Perrow brought a sociological perspective to organizational analysis and organizational behavior studies, fields which had been heavily influenced by psychology. He studied decision making in centralized versus decentralized organizations and presented a sociological view of the human-machine interface, particularly for decision making under varying abilities and demands. After the accident at the Three Mile Island nuclear power plant he became involved in studying what had happened, leading to his description of the Normal Accident, which he initially characterized as unpreventable and unanticipated, and therefore impossible to train for or design against (Perrow, 1981). The Normal Accident has four characteristics:
1. Signals only noticed in retrospect;
2. Multiple design and equipment failures;
3. Some type of operator error which is not considered error until the accident is understood;
4. “Negative synergy,” where the sum of equipment, design, and operator errors is far greater than the consequences of each singly.
His book, Normal Accidents (1984), introduced the idea that people interact with complex technological systems to create whole, or unitary, systems. Interactions occur within these systems along two significant dimensions: interactive complexity and coupling of components. The complexity and coupling, whether by design or happenstance, determine the system’s susceptibility to accidents and make accidents not only inevitable but normal.
Accidents then derive not so much from human cognition and behaviors, or from engineered designs, as from dynamic human-machine interactions. While these accidents may be predicted and the risk accepted, or identified in hindsight from missed data, they may also result from information that is undiscoverable until events unfold. In effect, the surprise reflects not what is cognitively missed but what is cognitively absent.
Interactive complexity refers to unfamiliar, unplanned, or unexpected sequences of events in the system, particularly at the level of the working environment. (The working environment is the place where the individual engages the organization’s external environment, whether at the executive level or the operational level.) These events may not be immediately visible or comprehensible. The measure of interactive complexity is the number of ways in which parts and relationships of the system can interact. While interactions may be linear by design, or remain linear while only a few occur, they easily become nonlinear as interactions increase in number or degree. Nonlinear interactions of only a few components can rapidly lead to complexity, new and unexpected properties of the system, and an accident.
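One way to see why the number of possible interactions overwhelms designers is simple counting. The sketch below (standard combinatorics; the component counts are illustrative, not drawn from Perrow) shows how pairwise and three-way interaction possibilities grow far faster than the number of components:

```python
from math import comb

# Potential interactions among n components:
#   pairwise  = C(n, 2)
#   three-way = C(n, 3), which grows even faster.
for n in (4, 8, 16, 32):
    pairs = comb(n, 2)
    triples = comb(n, 3)
    print(f"{n:>2} components: {pairs:>4} pairwise, {triples:>5} three-way interactions")
```

Doubling the component count roughly quadruples the pairwise possibilities and multiplies the three-way possibilities by about eight, which is why no operator or designer can hold every possible interaction in mind.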
Components of a system are joined together, or coupled, very loosely when the parts are not very dependent on each other. They are coupled tightly when the parts are highly interdependent, that is, linked to many parts in a time-dependent manner. In tightly coupled systems a change in one part rapidly affects the status of other parts and influences the system’s ability to recover. This allows small perturbations to rapidly cause large effects, while in a loosely coupled or decoupled system the sparse or weaker links allow the absorption of perturbations and dampen destabilization. The speed at which a change in one variable cascades through the system, altering or entraining other parts into the cascading events, is a measure of how tightly coupled the system is.
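The contrast between dampening and amplification can be sketched as a toy cascade model. Assume, purely for illustration, a chain of components in which each passes a fixed fraction (the coupling strength) of any disturbance on to the next; the function name and coupling values are hypothetical, not from Perrow:

```python
def cascade(perturbation: float, coupling: float, n_components: int) -> list[float]:
    """Propagate an initial perturbation down a chain of components.

    Each component passes `coupling` times its own disturbance to the
    next, so loose coupling (< 1.0) absorbs the upset while tight
    coupling (> 1.0) amplifies it as it spreads.
    """
    effects = [perturbation]
    for _ in range(n_components - 1):
        effects.append(effects[-1] * coupling)
    return effects

loose = cascade(1.0, 0.5, 6)   # loosely coupled: disturbance is absorbed
tight = cascade(1.0, 1.5, 6)   # tightly coupled: small upset grows large
print([round(e, 2) for e in loose])
print([round(e, 2) for e in tight])
```

In the loose chain the final component barely registers the original perturbation; in the tight chain it experiences several times the original disturbance, which is the sense in which tight coupling lets small perturbations cause large effects.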
The interactive complexity of the system is the trigger, while tight coupling is the propagator of events leading to the normal, or system, accident. Simple principles, when combined in a nonlinear manner, lead to complexity and novel properties of the system. These novel properties may not be possible to anticipate.
Solutions proposed by Perrow (1999) include (1) abandoning those systems where risks outweigh reasonable benefits, (2) making systems less risky where we can, even at considerable effort, and (3) enhancing those systems with the characteristics of self-correction or self-organization.
The difficulty of using Normal Accident Theory as a guide to risk is that it elevates possibility, the ease with which an event can happen, over probability, the likelihood that it will happen.
Perrow C. 1981. Normal accident at Three Mile Island. Society 18(5):17-26.
Perrow C. 1999. Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton University Press.