Doctor Earl’s 29th Law

In the 1980s, a revolution took place in the commercial airline industry. For the traveling public it was a quiet event, with few even aware of what was taking place. In those years, Boeing produced the first "glass cockpit" (computer-driven) aircraft, the B757 and B767. From the cockpit door rearward, things looked new and shiny but not dramatically different from older aircraft. From the cockpit door forward, radical change had occurred in aircraft systems management. The round-dial mechanical instruments with which every pilot was familiar were gone. In their place were large-format flat-panel computer screens that electronically reproduced instruments, gauges, and data in ways not thought possible before the advent of the computer. At the heart and soul of the design was the Flight Management System (FMS), through which each pilot controlled and navigated the aircraft.

Before the first of the 757/767s arrived at Delta Airlines, we began training the pilots who would fly these aircraft. Quickly, we recognized a large and growing problem: our pilots were failing their initial checkout at an alarmingly high rate, a rate never seen on our traditional "steam-powered" (old-technology) aircraft. These were not bad pilots. We knew the fault lay elsewhere.

Thus began the involvement of Dr. Earl Wiener. Earl was a professor at the University of Miami and a renowned NASA researcher. His real expertise was man-machine interfaces: how automation impacts working relationships within teams and how humans interact with technology. After analyzing the training program at Delta, Earl came to a profound conclusion: this was not a training issue per se; it was a design and leadership problem. The traditional manner in which our pilots were trained was failing to convey the significant differences in how the 757/767 had to be managed and flown. Further, we were not preparing our pilots with the skills and knowledge necessary to manage these aircraft. Our failure was born not of poor pilot skills but of bad training design. We were trying to use a traditional training system for a very non-traditional aircraft.

Throughout his career, Earl had conceived a series of simple rules that capture powerful truths. He called these wisdoms Wiener's Laws. As explanation for our problem, he shared his 29th Law with us, a law I prefer to call the "Law of Unintended Consequences": "Any time you solve a problem you create a problem. The most you can hope for is that the problem you solved is of greater magnitude than the one you created."

While these high-tech "glass" aircraft afforded new and improved methods of operation, they simultaneously created new and different challenges for the pilots who flew them. We had not fully understood the critical need to prepare and train our pilots for the very different processes necessary to operate the 757/767. With Earl's assistance, we created a three-day introductory course called "Introduction to Aviation Automation" (IAA). Every pilot coming to training on the 757/767 first completed this program, during which we specifically discussed the differences between traditional and "glass" cockpits. The failure rate dropped dramatically. As we gained experience with the aircraft, we incorporated new insights into IAA.

Let me illustrate the impact of the "Law of Unintended Consequences" with a real-world story.
One morning in April of 2002, six months after the tragedy of 9/11, a Delta MD88 aircraft left its departure gate in Atlanta for a flight to Washington Reagan airport. This particular aircraft had undergone a significant modification to its IFF beacon. (On the morning of 9/11, the hijackers turned the IFF beacons off. As a result, Air Traffic Control (ATC) lost the electronic signals indicating the location and flight path of the hijacked aircraft.) A seemingly logical solution to this problem was to install an IFF override switch in the cockpit which, when activated by the pilots, would send a hijack warning to ATC while preventing deactivation of the IFF until the aircraft landed.

Our MD88 crew knew their aircraft had this modified system. However, they had not been trained on its use, and the system was not yet ready for trial. Further, they did not know the switch had been inadvertently left in the "ON" position by maintenance. As soon as the aircraft became airborne, unbeknownst to the crew, the IFF system began sending hijack warnings to Atlanta tower and departure control. Bedlam reigned in the tower. A decision was quickly made to have the crew return to Atlanta to sort things out on the ground. Skies were overcast with a solid cloud layer at 1,500 feet. As the crew descended below the clouds during their landing approach, they saw two USAF F-16 fighters positioned off either wing! Had the crew failed to comply exactly with ATC instructions, this could have been the first shoot-down of a commercial airliner in US history!

How did this happen? In designing this modified IFF beacon system, someone had forgotten the "Law of Unintended Consequences."

There are many lessons to be learned from these experiences, but let's confine them to three main points:

1. In understanding the "Law of Unintended Consequences," we are better able to deal with the consequences of change, even though we cannot know what all the unintended consequences may be. Knowing that we will create unknown process issues whenever we induce change, we will be more prepared to deal with them when (not if) they occur.

2. There is no free lunch. Without exception, when we change systems in an effort to improve operations, we will create other problems, which will necessitate active and insightful intervention. If this is true, and it most assuredly is, we should do a risk-reward analysis before making changes, to more fully vet the question "Should we do this?" If the process changes we plan to employ are not mandatory, we may choose to leave well enough alone. "If it ain't broke, don't fix it!"

3. Never, ever forget the human aspects of change. Too often our focus is on technology, with inadequate attention given to how new processes affect those who have to operate it. Man-machine systems engineering has to be a central consideration: not only how we interface with the technology, but how the technology changes the working relationships within the team.

We cannot know the future, but we can anticipate its nature. Remember and apply the "Law of Unintended Consequences" whenever contemplating process changes. Doing so may not guarantee success, but not doing so will ensure failure. Be safe!

About the Author: Captain Alan Price was a founder and leader in Delta's Human Factors Program (Crew Resource Management, or CRM) and led "In Command" for five years before becoming Chief Pilot for Delta's Atlanta pilot base. He is a retired USAF Lt. Col. and Command Pilot, and he works with airlines, hospitals, and other team-centered organizations to apply teamwork, communication, and leadership skills in serving the passenger, the patient, the customer. He is the founder of Falcon Leadership, Inc. and can be reached on Twitter or by phone at 678-549-6858.
