Disclaimer! This is NOT an opinion piece, but rather a collection of readings and clippings meant to spur further exploration of the topic. These are not full articles, only excerpts from the larger body of available reading material. Citations and references have been included wherever possible; judge the legitimacy and accuracy of the clippings at your own discretion.
Milgram Experiment
The Power of Strong Authority
ABSTRACT
In this article the authority system in the airplane cockpit is related to Stanley Milgram's thirty-year-old studies of authority. Human errors made in the cockpit are found to be similar to those made in the authority experiments. It is argued that up to 20% of all airplane accidents may be preventable by optimizing the monitoring and challenging (M&C) of captain errors by the first officer.
INTRODUCTION
In a hierarchical organization, the boss's authority in the work function can be more or less absolute. In 1963, the eminent social psychologist Stanley Milgram measured the strength of authority in United States society. He found that it was much stronger than expected: a psychology experimenter was able to make subjects carry out orders that led to the simulated injury and death of a confederate.
Such strong authority tends to create situations in which errors made by authorities will not be corrected. In particular, this is the case in the airplane cockpit: a disproportionate number of accidents occur with the captain flying erroneously and the first officer failing to monitor and challenge the captain's errors.
We make the case that any lack of monitoring and challenging of the captain by the first officer is due to the already well-documented difficulty of monitoring and challenging authority in our society. The Milgram experiments are described briefly, specific connections between the experiments and the authority structure in airplane cockpits are drawn, and, using this framework, an accident is analyzed more closely using a cockpit voice recording.
Aviation organizational norms retain the individualistic thinking of the historical era of single-pilot planes. This tradition devalues the first officer. Thus, the institution of the first officer is "not fully developed," and the latter plays a "distinctly secondary role".
SOCIAL PSYCHOLOGY FINDINGS:
THE DIFFICULTY OF CHALLENGING STRONG AUTHORITY
Four of Milgram's findings can help shed light on inadequate monitoring and challenging in the airplane cockpit:
1. Excessive Obedience: Milgram found that most people can be made to inflict intense pain and even kill the learner.
2. Hesitant Challenging: The teacher’s objections to giving the learner electrical shocks were often hesitant and easily overruled by the experimenter’s replies, such as telling the teacher that “the experiment requires that you continue.”
3. Lack of Monitoring: The teacher accepts the authority’s definition of the situation, which does not include the choice of disobedience but only the necessity of continued obedience. Indeed, in the Milgram experiment not one out of almost a thousand teacher-subjects came up with an interpretation leading them to call the police or free the learner (Zimbardo, 1974).
4. Physical Closeness Matters: The experimenter's authority was found to be stronger the closer the teacher was to the experimenter.
In addition, there is the Milgram Prediction Error: predictions (made by psychiatrists, graduate students and faculty in the behavioral sciences, college sophomores, and middle-class adults) were shown to underestimate the rate of obedience to authority by a factor of a hundred (Milgram, 1974)! This prediction error, which persists to this day, keeps organizations from addressing the issue of how to protect against erroneous authority.
THE DIFFICULTY OF CHALLENGING AN ERRONEOUS CAPTAIN
Experimenter = Erroneous Captain
Teacher = Co-Pilot/First Officer
Learner = Everyone else in the airplane (the passengers)
There are similarities between the Milgram experimental situation and the behavior in the cockpit during distress. We make a simple correspondence between the Milgram experiment and the cockpit dynamics: the role of the experimenter is taken by the erroneous captain, the teacher is the first officer, and the harm to the learner and everybody else is the airplane crashing.
Observers of behavior in the aviation field have noted the tendency of the captain-first officer relationship to be too authoritarian in many instances. Ginnett (1993) writes about the tendency of the first officer not to question the captain (here, and later in other examples, I have inserted the applicable findings of Milgram, mentioned above, in square brackets):
The authority dynamic surrounding the role of the captain must be extremely powerful. . . . [and] has resulted in crew members not speaking up when necessary [Hesitant Challenging]. . . . This inclination may also result in excessive psychological dependence on the captain as leader to the extent that individual contributions to problem-solving are neither voiced nor attempted [Lack of Monitoring].
Lack of monitoring and challenging: a factor in 80% of the accident sample
In 1994 the NTSB (1994b) reviewed all flightcrew-involved major accidents of U.S. air carriers between 1978 and 1990.
The NTSB found that after procedural errors, errors of the type "monitoring/challenging" were the most common, occurring in 80% of the accident sample. These were errors in which the non-flying crew-member (the first officer in 81-87% of the cases) did not properly monitor and challenge the flying crew-member when errors were committed. Usually the errors that should have been monitored or challenged were listed as causal or contributing to the accident.
Using these data we can calculate how many accidents are related to inadequate monitoring and challenging. According to the NTSB, in 19 of the 37 accidents a monitoring/challenging error followed a causal error. Since the initial pool consisted of 75 accidents, approximately 25% (19 of 75) of all accidents could have been prevented by better monitoring and challenging. Keeping in mind that in 81-87% of the accidents the captain was the flying pilot, about 20% of all accidents could have been prevented if the first officer had better monitored and challenged the captain.
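A minimal sketch of this arithmetic in Python (the 0.84 below is my assumed midpoint of the 81-87% range, not a figure stated in the study):

# Back-of-the-envelope check of the NTSB-derived estimates above.
total_accidents = 75         # initial accident pool reviewed by the NTSB
mc_after_causal = 19         # accidents where an M&C error followed a causal error

preventable = mc_after_causal / total_accidents
print(f"Preventable by better M&C: {preventable:.0%}")  # -> 25%

captain_flying_share = 0.84  # assumed midpoint of the 81-87% range
print(f"Preventable by the first officer challenging the captain: "
      f"{preventable * captain_flying_share:.0%}")       # -> 21%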
The study recognized that a common factor in accidents was a tactical error by the pilot flying (PF) that had not been effectively challenged by the pilot not flying (PNF), and that this was proportionately far more common when the captain was the PF. It also referred to NASA's 1979 full-mission simulations, which documented that sound decision-making is more difficult for the captain when he is also acting as PF.
In discussing tactical errors, the 1994 study noted that when a course of action needs to be modified, "a captain/flying pilot must first perceive a need to change, then must alter his or her own current plan and behavior. The decision to change a course of action may be inhibited by overconfidence in ability or the earlier decision to engage in the ongoing course of action. These dynamics probably were relevant in eight accidents involving a failure to execute a go-around during unstabilised approaches."
The study continued, "Tactical decision errors of omission may be particularly difficult to catch, especially for first officers. In monitoring and challenging a captain's tactical decision error, a first officer may have difficulty both in deciding that the captain has made a faulty decision, and in choosing the correct time to question the decision. A first officer may be concerned that a challenge to a decision may be perceived as a direct challenge to the captain's authority.
For example, challenging a captain's failure to execute a go-around may be much more difficult for a first officer to do, in a timely fashion, than challenging a straightforward procedural error whose correction is unarguable, such as failure to turn on a transponder prior to takeoff.
The absence of action (error of omission) may not call attention to itself as an error as readily as an error of commission. Also, in many situations there may be a period of seconds or minutes when action could be taken. Thus, there may be no distinct signal or cue that now is the time to speak up about another crew member's failure to act, and a challenge may be deferred in hope that the error will be corrected soon."
A 2010 NASA study (Dismukes and Berman) points out the continuation of this problem. Its discussion of "deviation trapping" notes that "Captains in the monitoring pilot role were more than twice as likely to trap deviations made by the flying pilot than first officers in the monitoring pilot role (27.9% vs. 12.1%). This is consistent with [other quoted] flight simulation research showing that captains were more likely to challenge first officers flying the aircraft than vice versa".
Although these studies do not specifically use the phrase, this is what is now commonly called the "cross-cockpit authority gradient". The NTSB made extensive and detailed recommendations about the need for thorough Crew Resource Management (CRM) training to overcome these problems, which it clearly considered to be among the most fundamental issues affecting airline safety.
Sources:
https://mafiadoc.com/applications-of-the-milgram-experiments-authority-field-cogprints_59d4628e1723dd5b7569f98e.html
http://picma.info/?q=content/monitoring-and-challenging-failures
Birnbach, R., & Longridge, T. (1993). The regulatory perspective. In E. Wiener, B. Kanki, & R. Helmreich (Eds.), Cockpit resource management (pp. 263-281). San Diego, CA: Academic Press.
Bryant, A. (1994, Nov. 19). Chastened, T.W.A. tries again; business plan built on hope is revised. New York Times, p. 17N.
Edwards, E. (1975, October). Stress and the airline pilot. Paper presented at the British Airline Pilots Association Medical Symposium, London.
Ginnett, R. (1993). Crews as groups: Their formation and their leadership. In E. Wiener, B. Kanki, & R. Helmreich (Eds.), Cockpit resource management (pp. 71-98). San Diego, CA: Academic Press.
Helmreich, R., & Foushee, H. (1993). Why crew resource management? Empirical and theoretical bases of human factors training in aviation. In E. Wiener, B. Kanki, & R. Helmreich (Eds.), Cockpit resource management (pp. 3-45). San Diego, CA: Academic Press.
Merritt, A., & Helmreich, R. (1996). Human factors on the flight deck: The influence of national culture. Journal of Cross-Cultural Psychology, 27, 5-24.
Milgram, S. (1974). Obedience to authority: An experimental view. New York: Harper and Row.
National Transportation Safety Board (1994a). Controlled collision with terrain: Northwest Airlink Flight 5719, Hibbing, Minnesota, December 1, 1993. Washington, DC.
National Transportation Safety Board (1994b). A review of flightcrew-involved major accidents of U.S. air carriers, 1978 through 1990. Washington, DC.
Wiener, E., Kanki, B., & Helmreich, R. (1993). Cockpit resource management. San Diego, CA: Academic Press.
Zimbardo, P. G. (1974). On “Obedience to Authority.” American Psychologist, 29, 566-567.
APPENDIX I
The following checklist is derived from Normal Procedures, Aircraft Operating Manual - DC-9 revision 10 (9/4/95). Additions made by this author are indicated in italics.
Image Sources:
https://www.elephantjournal.com/2011/10/the-spirituality-of-cheeseburgers/
http://picma.info/?q=content/monitoring-and-challenging-failures
https://www.linkedin.com/pulse/20140217220032-266437464-asiana-airlines-sorry-captain-you-re-wrong