Scientists at Technical University of Kaiserslautern and Freie Universität Berlin Analyzed Interactions between Humans and Technology based on a Plane Crash in 2009
021/2017 from Jan 30, 2017
When Air France Flight 447 crashed into the Atlantic in June 2009, 228 people were killed, apparently due to a technical malfunction. The actual cause, however, was the interaction between humans and technology. In a recent study, Professor Dr. Gordon Müller-Seitz from the Technical University of Kaiserslautern and Dr. Olivier Berthod from Freie Universität Berlin investigated how the complex technology on board an aircraft, which is intended to ensure the safety of the flight, can lead to the opposite outcome if pilots become so stressed by a warning that they react the wrong way. The researchers advise caution in dealing with technical systems and recommend clear management structures in the cockpit.
During the flight over the Atlantic from Brazil to France, it must have seemed to the pilots that they were looking at a black screen. "It was the middle of the night, and everything around them was dark. They had to rely on the autopilot and the on-board system, so they could not see the actual position of the aircraft," said Professor Müller-Seitz, who does research on strategy, innovation, and cooperation at the Technical University of Kaiserslautern.
When the instruments suddenly showed false data regarding the aircraft's speed, the least experienced copilot was at the controls. The second copilot sat next to him; the captain had left the cockpit for a break. The copilot immediately initiated a climb, which many later considered a panic reaction. In their report, however, the experts were not sure whether the copilot had reacted as he did because the incorrect data indicated a descent. "An error was reported because the measurement sensors were frozen and provided incorrect data," said Professor Müller-Seitz. As a result of the climb, the aircraft was maneuvered into such an extreme angle that it stalled. "In the meantime, the sensors had thawed and were providing the correct data, but the pilots didn't recognize what was happening. Very soon it was too late to control the plane, and it crashed into the ocean a short time later," said Olivier Berthod, who does research on organization theory at Freie Universität Berlin.
The two researchers are interested in so-called sociomateriality. "This is about the interplay between humans and technology," said Müller-Seitz. "And about how the actions of humans and machines can feed back on each other and thus escalate a situation," added Berthod. In their study, they analyzed data from the aircraft's black boxes, the recordings of conversations in the cockpit, and other documents. "Examples like these show the consequences when humans rely too completely on automated systems and lose track of what is going on," said Berthod.
"The copilots were overwhelmed by the situation and under great emotional strain. They lacked the necessary experience and immediately reacted with stress and panic instead of remaining calm and looking for a solution as a team," continued Müller-Seitz. "Clear management structures and a regulated organization are imperative in such situations. They prevent hasty actions and mistakes."
It is also important to be familiar with such complex autonomous technology. "That is the only way to understand it, to recognize what is causing possible error messages, and to see how users can escalate errors through their own actions," said Berthod. In Airbus aircraft in particular, the technology is largely the same across the different types of aircraft. The cockpits are designed in such a way that the pilots are separated from the rest of the plane and can easily lose any sense of the outside world. "Our findings indicate that in situations like this it would be important for users of technology to have the feeling that they are interacting with their environment. It is also important for the pilots to be aware of the physical forces that they sometimes have to deal with," continued Müller-Seitz.
In their study, the two scientists call for reflective vigilance in handling technical systems. This applies not only to pilots, but also to other professional groups working with similar systems, for example, stock market brokers, whose computer programs handle millions of transactions at the same time, or financial managers, who are responsible for billions of euros in investments.
The study "Making Sense in Pitch Darkness: An Exploration of the Sociomateriality of Sensemaking in Crisis" was published in the prestigious Journal of Management Inquiry. DOI: 10.1177/1056492616686425
The photos may be downloaded by members of the media and used free of charge in connection with reporting on this press release and provided that due credit is given.