Managing the Unexpected
High Reliability develops an organization’s strengths through individual actions. Shared attitudes bridge the gap between the organization and the individual, and these shared attitudes determine High Reliability.
Among academicians, the source of uncertainty matters: whether it is technical, organizational, or social. Uncertainty makes engineering difficult and occasionally unsuccessful. With new technology there will be technical uncertainties that cannot be resolved. Uncertainty makes social interactions for problem solving both challenging and satisfying. Uncertainty drives self-organization. Normal Accident Theory (NAT) offers society the juncture to accept the uncertainty, bring resources to bear, or forgo the technology. The High Reliability Organization (HRO) offers society a challenge to move forward; this is part and parcel of High Reliability.
As practitioners, we see NAT and HRO as parts of a whole: how can we contain, react to, or interact with any uncertainty? We also see High Reliability as a collective response of individuals sharing common attitudes. Uncertainty, by definition, cannot be prepared for, so the High Reliability team prepares itself to function in the presence of uncertainty as part of daily operations. We are not ignoring arguments either way; we only believe they are specious. If a technology is needed, can we respond to its uncertainties (the HRO argument)? If a technology is not needed, and we cannot respond to its uncertainties, why assume the risk (the NAT argument)? The discussion becomes “How much can we respond to?” This may not be known until it is too late.
The danger is, at one extreme, not knowing the dangers to the extent possible and, at the other extreme, the mindset of “I told you so.” Either serves as a death knell to innovation, one to its creation, the other to its use. Failure is not so much the accident itself as the failure to identify the accident sufficiently early in its birth. Early identification, in this covert state, allows interaction with fewer resources and greater effectiveness. Before the accident, HRO carries the risk of cockiness; after the accident, NAT carries the risk of blame. Either is deadly.
HRO and NAT arguments sometimes fall into the “top down” versus “bottom up” discussion. Those who live with High Reliability see it as more of a centripetal and centrifugal flow toward and away from the event. Information flows away from the event, toward the organization’s center for processing. Action moves toward the event, away from central authority, to interact with the situation. We learn what works through action, and the response to our actions tells us the structure of the problem. There is no top down or bottom up, only toward and away.
In this setting, we do not break rules. Rules are written in blood. Of greatest importance is to identify, and identify quickly, if the rule, or any rule, does not apply, or if it competes or conflicts with other rules. Then, we quickly move to interactive judgment.
NAT is the environment of HRO. Society must judge the value of the technology, a judgment that is not a zero-sum game, reached by equation, or decided by self-described elder statesmen.