Playing Mental Defense Against Accidents

Accidents often result from a chain of poor decisions. To prevent them, we need to learn how to break a bad link in our thinking before it's too late. Here's how.


Most pilots know a majority of mishaps can be traced back to a chain of events that, if broken, would have prevented the accident from happening. No doubt some could have been prevented if the pilot had been able to press pause, or activate an airframe parachute, when the flight started going south. While most of us don't have that option, we can pull a mental ripcord of sorts and stop a progression of poor decision-making from becoming catastrophic.


While most pilots are familiar with the chain-of-events explanation of how accidents happen, little time is spent figuring out how to break the chain, particularly when it involves a form of distorted decision-making. That is, of course, easier said than done. If it were an easy task, we wouldn't spend as much time reading about perfectly performing airplanes operating in ordinary circumstances plowing into terrain with tragic results. One place to start is to look at the chain of mental events leading up to an accident, rather than just the aerodynamic aspects of the flight. Doing so can provide some helpful insights toward identifying the bad link before it gets added to the chain.

Case History

Take, for example, the case of a Gulfstream III on a positioning flight from Dallas Love Field to Houston's Hobby Airport (HOU) on November 22, 2004. In this fatal accident, skewed decision-making proved deadly. The twin-engine jet, with two very experienced crewmembers, crashed about three miles short of the field while shooting the Runway 4 ILS to minimums in low-visibility conditions. The accident investigation revealed no performance-related problems with the aircraft. Instead, the lead event in the fatal crash was the crew's not-uncommon misuse of the field VOR as the primary nav source for the ILS approach until about one minute prior to colliding with terrain.

As detailed in the NTSB report, at about seven miles out, the captain, who was flying, stated, as captured on the cockpit voice recorder, that he couldn't get approach mode displayed on his electronic attitude director indicator. Thirty seconds later, after the landing gear was extended, the first officer questioned his display as well. One minute later the captain queried the FO, "What did you do to me?" The question was prompted because the FO had just switched both multi-function displays' primary nav source from the VOR to the ILS, as it should have been in the first place.

After the captain asked if the co-pilot had switched his frequency, the FO replied, "We're all squared away now." Concerned, the captain said, "I don't know if I can get back on it [the ILS] in time." The FO replied, "Yeah, you will. You're squared away now." This colloquy presumably occurred with the localizer and glideslope indicators deflected near full right and high after the ILS was tuned. The GIII continued its descent below minimums until it hit a light pole before impacting terrain. The accident killed the pilots and a flight attendant also on board.

Cognitive Dissonance

Obviously this accident chain is readily discernible, particularly given the number of bad links in it. So, what allowed the operator's current chief pilot (the FO) and 19,000-hour former chief pilot (the captain) to drive a perfectly good GIII into the ground? One answer lies not in the plane's flight management system but with the crew's mental processing of discordant information, which was likely inhibited by what psychologists call "cognitive dissonance." For those who slept through Psych 101, this is a fancy term for the way the brain resolves conflicting thoughts and their resulting emotions.

Behind the concept of cognitive dissonance is the notion that people do not like to have dissonant or conflicting thoughts in their mind. Most of us would rather avoid the psychological pain of mental conflict. An everyday example would involve a person who smokes and knows doing so carries a risk of lung cancer. To alleviate that psychological pain, he or she can rationalize that quitting would cause a weight gain, which would not be healthy, either. The mental conflict minimized, the person continues to smoke.

As applied to aeronautical decision-making, cognitive dissonance can play havoc. It's probably not an exaggeration to consider it a ticking time bomb in your head. Once you're aware of it, it needs to be disarmed. Here's an easy example illustrating how cognitive dissonance arises.

Making Decisions

During a long flight back to your home base, you realize you're low on fuel as a result of higher-than-forecast headwinds. Instead of making an intermediate fuel stop, which would eat up time and keep you from getting home before dark, you press on out of the desire to get the bird back in its hangar. As you do, you say to yourself, "With a little power reduction and aggressive leaning, I'll be able to make it." In this case, the conflicting thoughts of low fuel and getting there on time were resolved with a little mental whitewash.

While the insidiousness of cognitive dissonance is easily seen in this example, the question is how it can be recognized and countered in real-time situations. A successful defense against it has just two elements. The first is knowing when you're vulnerable. The second is objective, honest decision-making. This defense is graphically described in the flowchart above, which may provide a helpful construct and practical application to the real world.

Remember, cognitive dissonance comes into play by definition only when conflicting thoughts or circumstances arise. This will happen when a circumstance or event creates what we'll call an out-of-the-box condition. This is simply a situation where you are operating outside standard parameters, defined by such items as your personal minimums, experience, the airplane's flight manual or the FARs, to name a few. It's likely to be present anytime your stress level rises unexpectedly or emotions surface.

The exact boundaries of the Normal Operations Box depicted in the flowchart will, of course, vary depending on one's piloting skills, experience, equipment and personal minimums, to name a few. The point is simply to know the boundary lines of what you consider normal operating conditions so you can discern an abnormal condition or event presenting elevated risk.

Putting It Into Practice

So, in practice, my Normal Operations Box for an ILS approach may be cloud bases of 500 feet and two miles visibility. If I get to 400 feet and I'm still in the soup, I'm out of my standard operating box, and at that decision point I know I'll need to play mental defense against cognitive dissonance.
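To make the idea concrete, here is a minimal sketch, in Python, of what it looks like to write the box down before the flight and let it make the go/no-go call. The field names and structure (NormalOpsBox, Conditions, decide) are illustrative inventions, and the 500-foot/two-mile numbers are simply the personal minimums from the example above, not anything prescribed by regulation.

```python
# Illustrative sketch only: a "Normal Operations Box" for an ILS approach,
# expressed as explicit personal minimums set before the flight.

from dataclasses import dataclass

@dataclass
class NormalOpsBox:
    min_ceiling_ft: int = 500       # personal minimum cloud base, feet AGL
    min_visibility_sm: float = 2.0  # personal minimum visibility, statute miles

@dataclass
class Conditions:
    ceiling_ft: int
    visibility_sm: float
    correct_nav_source: bool  # e.g., the ILS actually tuned and identified
    stabilized: bool          # on localizer and glideslope, configured to land

def decide(box: NormalOpsBox, now: Conditions) -> str:
    """Continue only if every parameter is inside the box; any out-of-box
    condition forces a time-out or missed approach, with no middle ground."""
    inside = (
        now.ceiling_ft >= box.min_ceiling_ft
        and now.visibility_sm >= box.min_visibility_sm
        and now.correct_nav_source
        and now.stabilized
    )
    # Note there is deliberately no branch for "just a little lower."
    return "CONTINUE" if inside else "TIME-OUT / GO MISSED"

if __name__ == "__main__":
    # 400-foot bases against a 500-foot personal minimum: out of the box.
    print(decide(NormalOpsBox(),
                 Conditions(ceiling_ft=400, visibility_sm=2.0,
                            correct_nav_source=True, stabilized=True)))
```

The point of the sketch is not the numbers but the structure: the boundaries are committed to in advance, and the decision logic leaves no room for the rationalizations described below.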

In hindsight, it's fairly easy to pick out the point at which a pilot detects being outside the box. It's usually accompanied by statements or thoughts of "that's strange," "that doesn't look right," or "what's it doing now?" Other indicators of being outside the box would be destination weather below minimums, multiple missed approaches, a flagged navaid, abnormal engine performance, an illuminated annunciator light or the time-honored uttered expletive.

It's at this point, honestly recognizing one is out of the Normal Operations Box, that the mental defense against cognitive dissonance needs to go up. It doesn't necessarily mean an immediate course reversal, but it does call for an immediate time-out to assess the situation. It's during the process of determining the next course of action that you must guard against the undesirable ways the mind removes conflicting thoughts. The most common responses are to deny the problem exists, to minimize or trivialize it, to rationalize the circumstances to justify acceptance of the increased risk, or to delay a decision that might resolve the conflict.

The point is that when choosing a corrective or alternative plan of action, the decision must be objective and not clouded by emotion, mental deceptions or dishonesty. Thus, in the case of my ILS approach down to 400 feet, if I'm going to adhere to my preset operating practices, I must go missed. It's at this point, however, that I must be on guard for the thought that so far everything's looking good, so I'll just drop down another 200 feet, since it's legal anyway, and avoid the hassle of driving home from another airport. By doing so, I have just allowed creation of the first bad link in a potential accident chain. Going missed ensures the link is not created (assuming you've recently practiced a few missed approach procedures).

Back To Houston

To see how this construct would have helped the GIII crew, let's go back inside the cockpit and pick up their approach into Hobby. For a professional flight crew, their box is defined in part by the company's standard operating procedures and well-maintained equipment, as well as thousands of hours of flight time. It's a standard operating box, however, that would have provided a safe approach to minimums, even in the dark with a low RVR and broken clouds at 100 feet. Yet on this flight, millions of dollars of metal and avionics were of no use simply because the crew got out of the box while turning onto the localizer without the ILS frequency tuned on Nav1. Indeed, the captain became aware of this fact almost two minutes before impact, when he said, "I can't get approach mode on my thing." At this point, mental discipline must kick in with: "I'm out of the box, and I will not continue the approach until I'm safely back in it." At a minimum, the crew needed to level off for a few seconds to troubleshoot the situation and pinpoint exactly where they were and, if not able to get back in the box immediately, call for a missed approach without delay.

While we cannot know exactly what was going through the FO's mind, one distinct possibility was the presence of cognitive dissonance. This is evidenced by his reluctance to admit his error of failing to put the ILS on Nav1. So too, a missed approach would have delayed their arrival, and on this particular flight that may have been weighing more heavily than usual: they were inbound to Houston to pick up the first former President Bush and his wife. The motivation for an on-time arrival, even via an out-of-the-box approach, is readily evidenced in the FO's discounting the risk of continuing the approach. He encouraged the captain to do so even after the captain said, "I don't know if I can get back on it in time." The FO replied, "Yeah you will…you're all squared away now." Clearly an example of denial, to say the least.

What Are You Thinking?

The GIII into Hobby is just one example of many accidents where cognitive dissonance likely played a role. To better grasp the concept, however, review in your own mind the times you got too close to the edge of your personal flight envelope, and recall what you were thinking. There's a good chance some of those thoughts were rationalizations or insidious deceptions geared to resolve the mental conflict and downplay the risks of pressing on. That done, rerun the mental tape of that event against the template of what should have constituted a standard operating environment. The first point at which the flight deviated from the norm is where you should have bailed out or reconfigured to get back inside the box, and thus prevented the clouded decision-making that can arise from cognitive dissonance.

While decision-making clouded by cognitive dissonance cannot explain every accident, it is likely more common than many realize. With that in mind, avoiding an accident may require breaking the mental links in the chain, not just those involving flight dynamics gone awry. To do so, one must maintain a watch for the first indication that some aspect of a flight has gone abnormal. That accomplished, guarding against the evils of cognitive dissonance becomes easier. Moreover, it may be just what it takes to break a lengthening chain of risk before it gets wrapped around your neck.
